Feb 01 06:42:48 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 01 06:42:48 crc restorecon[4471]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:42:48 crc restorecon[4471]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc 
restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc 
restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 
06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc 
restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc 
restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 
crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:48 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:48 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 
crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc 
restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:42:49 crc restorecon[4471]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc 
restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc 
restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc 
restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc 
restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:42:49 crc restorecon[4471]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 01 06:42:49 crc kubenswrapper[4546]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 01 06:42:49 crc kubenswrapper[4546]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 01 06:42:49 crc kubenswrapper[4546]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 01 06:42:49 crc kubenswrapper[4546]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 01 06:42:49 crc kubenswrapper[4546]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 01 06:42:49 crc kubenswrapper[4546]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.533745 4546 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536546 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536564 4546 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536569 4546 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536573 4546 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536577 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536581 4546 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536586 4546 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536590 4546 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536593 4546 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 01 06:42:49 
crc kubenswrapper[4546]: W0201 06:42:49.536597 4546 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536601 4546 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536605 4546 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536608 4546 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536611 4546 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536614 4546 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536617 4546 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536620 4546 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536624 4546 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536627 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536630 4546 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536633 4546 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536636 4546 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536639 4546 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 01 06:42:49 crc kubenswrapper[4546]: 
W0201 06:42:49.536642 4546 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536650 4546 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536654 4546 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536657 4546 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536660 4546 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536664 4546 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536668 4546 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536672 4546 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536675 4546 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536678 4546 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536682 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536686 4546 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536689 4546 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536693 4546 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536699 4546 
feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536704 4546 feature_gate.go:330] unrecognized feature gate: Example Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536709 4546 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536713 4546 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536717 4546 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536720 4546 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536724 4546 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536728 4546 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536731 4546 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536735 4546 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536738 4546 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536741 4546 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536745 4546 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536748 4546 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536752 4546 
feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536755 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536758 4546 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536762 4546 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536765 4546 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536769 4546 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536772 4546 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536776 4546 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536779 4546 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536784 4546 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536787 4546 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536791 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536795 4546 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536799 4546 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536803 4546 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536806 4546 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536810 4546 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536813 4546 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536817 4546 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.536820 4546 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537400 4546 flags.go:64] FLAG: --address="0.0.0.0" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537412 4546 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537419 4546 flags.go:64] FLAG: --anonymous-auth="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537424 4546 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537429 4546 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537432 4546 flags.go:64] 
FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537437 4546 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537441 4546 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537445 4546 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537449 4546 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537453 4546 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537457 4546 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537461 4546 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537464 4546 flags.go:64] FLAG: --cgroup-root="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537468 4546 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537472 4546 flags.go:64] FLAG: --client-ca-file="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537476 4546 flags.go:64] FLAG: --cloud-config="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537480 4546 flags.go:64] FLAG: --cloud-provider="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537484 4546 flags.go:64] FLAG: --cluster-dns="[]" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537489 4546 flags.go:64] FLAG: --cluster-domain="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537493 4546 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537497 4546 flags.go:64] FLAG: --config-dir="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 
06:42:49.537501 4546 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537505 4546 flags.go:64] FLAG: --container-log-max-files="5" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537510 4546 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537514 4546 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537517 4546 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537521 4546 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537525 4546 flags.go:64] FLAG: --contention-profiling="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537529 4546 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537533 4546 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537537 4546 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537540 4546 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537545 4546 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537548 4546 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537552 4546 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537556 4546 flags.go:64] FLAG: --enable-load-reader="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537560 4546 flags.go:64] FLAG: --enable-server="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537563 4546 flags.go:64] FLAG: 
--enforce-node-allocatable="[pods]" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537568 4546 flags.go:64] FLAG: --event-burst="100" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537572 4546 flags.go:64] FLAG: --event-qps="50" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537576 4546 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537579 4546 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537583 4546 flags.go:64] FLAG: --eviction-hard="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537587 4546 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537591 4546 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537595 4546 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537598 4546 flags.go:64] FLAG: --eviction-soft="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537602 4546 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537606 4546 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537609 4546 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537613 4546 flags.go:64] FLAG: --experimental-mounter-path="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537617 4546 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537620 4546 flags.go:64] FLAG: --fail-swap-on="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537624 4546 flags.go:64] FLAG: --feature-gates="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537628 4546 flags.go:64] FLAG: 
--file-check-frequency="20s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537632 4546 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537635 4546 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537639 4546 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537644 4546 flags.go:64] FLAG: --healthz-port="10248" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537648 4546 flags.go:64] FLAG: --help="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537653 4546 flags.go:64] FLAG: --hostname-override="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537657 4546 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537660 4546 flags.go:64] FLAG: --http-check-frequency="20s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537664 4546 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537667 4546 flags.go:64] FLAG: --image-credential-provider-config="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537671 4546 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537675 4546 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537678 4546 flags.go:64] FLAG: --image-service-endpoint="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537682 4546 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537685 4546 flags.go:64] FLAG: --kube-api-burst="100" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537689 4546 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537693 
4546 flags.go:64] FLAG: --kube-api-qps="50" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537697 4546 flags.go:64] FLAG: --kube-reserved="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537701 4546 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537705 4546 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537708 4546 flags.go:64] FLAG: --kubelet-cgroups="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537712 4546 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537716 4546 flags.go:64] FLAG: --lock-file="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537719 4546 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537723 4546 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537726 4546 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537736 4546 flags.go:64] FLAG: --log-json-split-stream="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537740 4546 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537743 4546 flags.go:64] FLAG: --log-text-split-stream="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537747 4546 flags.go:64] FLAG: --logging-format="text" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537750 4546 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537754 4546 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537758 4546 flags.go:64] FLAG: --manifest-url="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537761 4546 
flags.go:64] FLAG: --manifest-url-header="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537766 4546 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537770 4546 flags.go:64] FLAG: --max-open-files="1000000" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537774 4546 flags.go:64] FLAG: --max-pods="110" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537779 4546 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537783 4546 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537786 4546 flags.go:64] FLAG: --memory-manager-policy="None" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537790 4546 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537794 4546 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537798 4546 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537802 4546 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537809 4546 flags.go:64] FLAG: --node-status-max-images="50" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537813 4546 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537817 4546 flags.go:64] FLAG: --oom-score-adj="-999" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537821 4546 flags.go:64] FLAG: --pod-cidr="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537824 4546 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537831 4546 flags.go:64] FLAG: --pod-manifest-path="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537835 4546 flags.go:64] FLAG: --pod-max-pids="-1" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537838 4546 flags.go:64] FLAG: --pods-per-core="0" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537842 4546 flags.go:64] FLAG: --port="10250" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537846 4546 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537851 4546 flags.go:64] FLAG: --provider-id="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537868 4546 flags.go:64] FLAG: --qos-reserved="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537873 4546 flags.go:64] FLAG: --read-only-port="10255" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537877 4546 flags.go:64] FLAG: --register-node="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537881 4546 flags.go:64] FLAG: --register-schedulable="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537885 4546 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537891 4546 flags.go:64] FLAG: --registry-burst="10" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537895 4546 flags.go:64] FLAG: --registry-qps="5" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537899 4546 flags.go:64] FLAG: --reserved-cpus="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537903 4546 flags.go:64] FLAG: --reserved-memory="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537907 4546 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 
06:42:49.537911 4546 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537914 4546 flags.go:64] FLAG: --rotate-certificates="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537918 4546 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537923 4546 flags.go:64] FLAG: --runonce="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537926 4546 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537930 4546 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537934 4546 flags.go:64] FLAG: --seccomp-default="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537937 4546 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537941 4546 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537945 4546 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537948 4546 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537952 4546 flags.go:64] FLAG: --storage-driver-password="root" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537956 4546 flags.go:64] FLAG: --storage-driver-secure="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537959 4546 flags.go:64] FLAG: --storage-driver-table="stats" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537963 4546 flags.go:64] FLAG: --storage-driver-user="root" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537966 4546 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537970 4546 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 01 
06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537973 4546 flags.go:64] FLAG: --system-cgroups="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537977 4546 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537982 4546 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537985 4546 flags.go:64] FLAG: --tls-cert-file="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537989 4546 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537993 4546 flags.go:64] FLAG: --tls-min-version="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.537997 4546 flags.go:64] FLAG: --tls-private-key-file="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.538001 4546 flags.go:64] FLAG: --topology-manager-policy="none" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.538004 4546 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.538008 4546 flags.go:64] FLAG: --topology-manager-scope="container" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.538011 4546 flags.go:64] FLAG: --v="2" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.538016 4546 flags.go:64] FLAG: --version="false" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.538020 4546 flags.go:64] FLAG: --vmodule="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.538024 4546 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.538028 4546 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538124 4546 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538129 4546 feature_gate.go:330] unrecognized feature gate: 
HardwareSpeed Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538133 4546 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538136 4546 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538139 4546 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538143 4546 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538146 4546 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538150 4546 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538153 4546 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538157 4546 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538161 4546 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538165 4546 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538176 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538180 4546 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538183 4546 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538186 4546 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538189 4546 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538193 4546 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538197 4546 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538200 4546 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538203 4546 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538208 4546 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538212 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538215 4546 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538219 4546 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538222 4546 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538225 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538229 4546 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538235 4546 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538238 4546 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538242 4546 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538245 4546 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538249 4546 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538254 4546 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538258 4546 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538262 4546 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538266 4546 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538269 4546 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538272 4546 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538276 4546 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538279 4546 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538282 4546 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538286 4546 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538289 4546 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538292 4546 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538296 4546 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538299 4546 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538303 4546 
feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538307 4546 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538311 4546 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538314 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538318 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538322 4546 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538326 4546 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538329 4546 feature_gate.go:330] unrecognized feature gate: Example Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538332 4546 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538335 4546 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538338 4546 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538341 4546 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538345 4546 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538348 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538351 4546 feature_gate.go:330] unrecognized feature gate: 
VSphereControlPlaneMachineSet Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538356 4546 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538363 4546 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538367 4546 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538370 4546 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538374 4546 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538377 4546 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538381 4546 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538384 4546 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.538388 4546 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.538398 4546 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.544416 4546 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 
06:42:49.544447 4546 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544510 4546 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544521 4546 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544525 4546 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544529 4546 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544533 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544537 4546 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544541 4546 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544545 4546 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544549 4546 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544552 4546 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544556 4546 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544559 4546 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544563 4546 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544566 4546 feature_gate.go:330] 
unrecognized feature gate: PersistentIPsForVirtualization Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544570 4546 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544574 4546 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544578 4546 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544582 4546 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544585 4546 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544588 4546 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544592 4546 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544596 4546 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544601 4546 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544606 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544610 4546 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544614 4546 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544618 4546 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544622 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544625 4546 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544628 4546 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544632 4546 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544635 4546 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544639 4546 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544642 4546 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544646 4546 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544650 4546 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544653 
4546 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544656 4546 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544660 4546 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544663 4546 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544666 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544669 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544672 4546 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544676 4546 feature_gate.go:330] unrecognized feature gate: Example Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544679 4546 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544682 4546 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544685 4546 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544688 4546 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544691 4546 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544695 4546 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544699 4546 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544702 4546 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544706 4546 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544709 4546 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544712 4546 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544717 4546 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544721 4546 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544724 4546 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544727 4546 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544730 4546 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544734 4546 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544737 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544741 4546 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544744 4546 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544747 4546 feature_gate.go:330] 
unrecognized feature gate: EtcdBackendQuota Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544750 4546 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544754 4546 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544757 4546 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544762 4546 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544766 4546 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544770 4546 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.544776 4546 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544900 4546 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544906 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544911 4546 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544915 4546 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 01 
06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544918 4546 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544922 4546 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544925 4546 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544929 4546 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544933 4546 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544937 4546 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544941 4546 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544944 4546 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544948 4546 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544951 4546 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544954 4546 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544958 4546 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544962 4546 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544966 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544969 4546 feature_gate.go:330] unrecognized 
feature gate: InsightsOnDemandDataGather Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544972 4546 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544975 4546 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544978 4546 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544981 4546 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544984 4546 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544987 4546 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544990 4546 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544993 4546 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.544997 4546 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545000 4546 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545004 4546 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545009 4546 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545012 4546 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545016 4546 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545020 4546 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545024 4546 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545027 4546 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545030 4546 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545033 4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545037 4546 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545041 4546 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545045 4546 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545049 4546 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545052 4546 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545057 4546 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545061 4546 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545065 4546 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545069 4546 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545074 4546 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545079 4546 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545083 4546 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545086 4546 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545089 4546 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545092 4546 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545095 4546 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545099 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545102 4546 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545105 4546 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545108 4546 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545111 
4546 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545114 4546 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545117 4546 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545121 4546 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545125 4546 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545129 4546 feature_gate.go:330] unrecognized feature gate: Example Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545133 4546 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545136 4546 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545139 4546 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545142 4546 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545146 4546 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545149 4546 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.545152 4546 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.545158 4546 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false 
EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.545289 4546 server.go:940] "Client rotation is on, will bootstrap in background" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.548234 4546 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.548499 4546 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.549582 4546 server.go:997] "Starting client certificate rotation" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.549610 4546 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.549739 4546 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-01 03:15:21.887700601 +0000 UTC Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.549804 4546 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.560343 4546 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.562584 4546 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.26.196:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.563339 4546 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.570394 4546 log.go:25] "Validated CRI v1 runtime API" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.586522 4546 log.go:25] "Validated CRI v1 image API" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.587479 4546 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.590066 4546 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-01-06-39-17-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.590087 4546 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} 
overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.599827 4546 manager.go:217] Machine: {Timestamp:2026-02-01 06:42:49.598577781 +0000 UTC m=+0.249513807 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2445404 MemoryCapacity:25199476736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9a98126f-f656-4047-9b34-a8185f08b8ca BootID:fa428684-6fb6-45d8-b94c-216375fbfbe8 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:66:a7:de 
Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:66:a7:de Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:e0:98:3f Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:ab:07:b7 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:63:17:4a Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:2b:54:73 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:d6:06:14:4f:fa:f2 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:4e:23:af:fd:b2:49 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199476736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.599972 4546 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.600056 4546 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.600930 4546 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.601073 4546 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.601098 4546 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.601272 4546 topology_manager.go:138] "Creating topology manager with none policy" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.601282 4546 container_manager_linux.go:303] "Creating device plugin manager" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.601581 4546 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.601607 4546 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.601995 4546 state_mem.go:36] "Initialized new in-memory state store" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.602067 4546 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.603707 4546 kubelet.go:418] "Attempting to sync node with API server" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.603726 4546 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.603817 4546 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.603835 4546 kubelet.go:324] "Adding apiserver pod source" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.603845 4546 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.606350 4546 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.606397 4546 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.196:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.606389 4546 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.606434 4546 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.196:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.606825 4546 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.607281 4546 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.610251 4546 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611447 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611469 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611476 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611483 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611493 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611498 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611504 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611513 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611520 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611526 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611535 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611542 4546 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611557 4546 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.611817 4546 server.go:1280] "Started kubelet" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.612377 4546 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.612379 4546 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.612566 4546 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused Feb 01 06:42:49 crc systemd[1]: Started Kubernetes Kubelet. Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.613040 4546 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.615341 4546 server.go:460] "Adding debug handlers to kubelet server" Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.615432 4546 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.196:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18900c4af3ab25cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:42:49.611797967 +0000 UTC m=+0.262733984,LastTimestamp:2026-02-01 06:42:49.611797967 +0000 UTC m=+0.262733984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 01 06:42:49 crc 
kubenswrapper[4546]: I0201 06:42:49.616957 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.616983 4546 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.617114 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 09:23:43.404623113 +0000 UTC Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.617397 4546 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.617412 4546 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.617497 4546 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.617597 4546 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.617897 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused" interval="200ms" Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.618058 4546 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.618095 4546 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.196:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.619780 4546 factory.go:55] Registering systemd factory Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.619799 4546 factory.go:221] Registration of the systemd container factory successfully Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.621840 4546 factory.go:153] Registering CRI-O factory Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.621872 4546 factory.go:221] Registration of the crio container factory successfully Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.621923 4546 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.621940 4546 factory.go:103] Registering Raw factory Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.621973 4546 manager.go:1196] Started watching for new ooms in manager Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622385 4546 manager.go:319] Starting recovery of all containers Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622505 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622549 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 01 
06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622560 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622569 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622577 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622584 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622592 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622599 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 
06:42:49.622609 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622617 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622625 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622635 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622643 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622652 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622659 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622667 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622691 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622699 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622707 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622717 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622725 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622733 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622753 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622762 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622769 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622778 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622787 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622797 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622805 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622813 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622820 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622828 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622835 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622843 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622850 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622884 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622893 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622902 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622910 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622918 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622925 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622934 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622941 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622948 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622956 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622964 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622971 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622980 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622988 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.622996 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623003 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623012 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623023 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623032 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623042 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623050 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623059 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" 
seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623080 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623089 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623100 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623108 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623115 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623123 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623131 4546 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623140 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623148 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623155 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623172 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623181 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623189 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623196 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623204 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623212 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623219 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623227 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623235 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623242 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623250 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623257 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623265 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623273 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623280 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623288 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623296 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623303 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623311 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623320 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623328 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623336 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623345 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623353 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623362 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623370 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623377 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623384 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623392 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623401 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623408 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623417 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623424 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623433 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623440 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623447 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623455 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623467 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623496 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" 
seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623507 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623516 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623526 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623534 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623544 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623552 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 01 06:42:49 crc 
kubenswrapper[4546]: I0201 06:42:49.623561 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623570 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623579 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623587 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623595 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623603 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623611 4546 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623619 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623626 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623633 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623641 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623648 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623656 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623665 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623672 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623681 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623689 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623696 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623703 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623711 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623718 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623725 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623733 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623740 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623748 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623756 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623763 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623770 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623777 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623784 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623792 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623804 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623812 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623822 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623830 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623837 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623845 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" 
seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623888 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623897 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623905 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623913 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623921 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623929 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 
06:42:49.623936 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623944 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623952 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623959 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623967 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623976 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623983 4546 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.623992 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624000 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624007 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624014 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624022 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624029 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624037 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624044 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624053 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624060 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624068 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624075 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624082 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624091 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624100 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624110 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624118 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624125 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624132 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624141 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624149 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624157 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624173 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624181 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624189 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624196 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.624204 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631355 4546 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631404 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631461 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631487 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631498 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631511 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631521 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631530 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631541 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631549 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631562 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631576 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631585 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631597 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631605 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631614 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631625 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631633 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631645 4546 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631654 4546 reconstruct.go:97] "Volume reconstruction finished"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.631660 4546 reconciler.go:26] "Reconciler: start to sync state"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.635110 4546 manager.go:324] Recovery completed
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.643763 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.644720 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.644751 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.644766 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.646913 4546 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.646932 4546 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.646947 4546 state_mem.go:36] "Initialized new in-memory state store"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.651823 4546 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.652433 4546 policy_none.go:49] "None policy: Start"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.653606 4546 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.653633 4546 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.653651 4546 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.653683 4546 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.654758 4546 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.654781 4546 state_mem.go:35] "Initializing new in-memory state store"
Feb 01 06:42:49 crc kubenswrapper[4546]: W0201 06:42:49.654803 4546 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused
Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.654845 4546 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.196:6443: connect: connection refused" logger="UnhandledError"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.692570 4546 manager.go:334] "Starting Device Plugin manager"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.692606 4546 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.692616 4546 server.go:79] "Starting device plugin registration server"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.692956 4546 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.692968 4546 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.693114 4546 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.693205 4546 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.693227 4546 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.700135 4546 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.753939 4546 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.754010 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.754657 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.754683 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.754709 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.754800 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.754947 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.754984 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.755395 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.755435 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.755447 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.755633 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.755742 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.755772 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756130 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756163 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756174 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756556 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756577 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756584 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756597 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756609 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756616 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756707 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756823 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.756873 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.757295 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.757314 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.757323 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.757408 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.757521 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.757548 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.757700 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.757740 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.757752 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758059 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758097 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758105 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758281 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758318 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758331 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758336 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758341 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758876 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758898 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.758919 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.793639 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.794289 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.794310 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.794318 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.794332 4546 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.794681 4546 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.196:6443: connect: connection refused" node="crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.818506 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused" interval="400ms"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.833665 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.833731 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.833807 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.833852 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.833885 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.833912 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.833972 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.834002 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.834032 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.834056 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.834078 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.834092 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.834105 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.834118 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.834170 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.934829 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.934880 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.934897 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.934910 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.934923 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.934935 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.934965 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.934979 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935051 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935059 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935094 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935113 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935170 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935204 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935231 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935242 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935308 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935291 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935245 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935345 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935392 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935399 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935425 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935433 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935441 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935443 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935458 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935476 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.935493 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.936242 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.995333 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.996875 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.996908 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.996917 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:49 crc kubenswrapper[4546]: I0201 06:42:49.996954 4546 kubelet_node_status.go:76] "Attempting 
to register node" node="crc" Feb 01 06:42:49 crc kubenswrapper[4546]: E0201 06:42:49.997558 4546 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.196:6443: connect: connection refused" node="crc" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.085233 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.089929 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 01 06:42:50 crc kubenswrapper[4546]: W0201 06:42:50.107215 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-69e8cbaf16640dc8405230e53aa482a5d9db2fe8c95ec5af36063d17431f6e6c WatchSource:0}: Error finding container 69e8cbaf16640dc8405230e53aa482a5d9db2fe8c95ec5af36063d17431f6e6c: Status 404 returned error can't find the container with id 69e8cbaf16640dc8405230e53aa482a5d9db2fe8c95ec5af36063d17431f6e6c Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.108349 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:42:50 crc kubenswrapper[4546]: W0201 06:42:50.109487 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a52d82e25fd446a34270a637b2053cc309c5fa1aac002f7a755a617f300c73af WatchSource:0}: Error finding container a52d82e25fd446a34270a637b2053cc309c5fa1aac002f7a755a617f300c73af: Status 404 returned error can't find the container with id a52d82e25fd446a34270a637b2053cc309c5fa1aac002f7a755a617f300c73af Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.115294 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.118115 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:42:50 crc kubenswrapper[4546]: W0201 06:42:50.121428 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0e77802031f953a4743ac60c63a6c068b8be7167ea410edb36106003f94321ab WatchSource:0}: Error finding container 0e77802031f953a4743ac60c63a6c068b8be7167ea410edb36106003f94321ab: Status 404 returned error can't find the container with id 0e77802031f953a4743ac60c63a6c068b8be7167ea410edb36106003f94321ab Feb 01 06:42:50 crc kubenswrapper[4546]: W0201 06:42:50.129911 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-1939b89272132ad0a365a745e56d5603ec0add673082ffc8940fd7c55f2d0432 WatchSource:0}: Error finding container 1939b89272132ad0a365a745e56d5603ec0add673082ffc8940fd7c55f2d0432: Status 404 returned error can't find 
the container with id 1939b89272132ad0a365a745e56d5603ec0add673082ffc8940fd7c55f2d0432 Feb 01 06:42:50 crc kubenswrapper[4546]: W0201 06:42:50.136769 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-be7744a6de911d006ee54b6bbca74a02941cfcffcda454bf331ffc7e59709e1e WatchSource:0}: Error finding container be7744a6de911d006ee54b6bbca74a02941cfcffcda454bf331ffc7e59709e1e: Status 404 returned error can't find the container with id be7744a6de911d006ee54b6bbca74a02941cfcffcda454bf331ffc7e59709e1e Feb 01 06:42:50 crc kubenswrapper[4546]: E0201 06:42:50.219204 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused" interval="800ms" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.398403 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.399300 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.399330 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.399339 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.399357 4546 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:42:50 crc kubenswrapper[4546]: E0201 06:42:50.399618 4546 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.196:6443: 
connect: connection refused" node="crc" Feb 01 06:42:50 crc kubenswrapper[4546]: W0201 06:42:50.443795 4546 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused Feb 01 06:42:50 crc kubenswrapper[4546]: E0201 06:42:50.444068 4546 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.196:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.614055 4546 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.618174 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 22:05:41.506158924 +0000 UTC Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.657608 4546 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd" exitCode=0 Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.657683 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd"} Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.657767 4546 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e77802031f953a4743ac60c63a6c068b8be7167ea410edb36106003f94321ab"} Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.657885 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.658764 4546 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="33d6c0a0a888216f28a46b19f389fee88bfa5e8ac63e12d04b1d066cb899612b" exitCode=0 Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.658825 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"33d6c0a0a888216f28a46b19f389fee88bfa5e8ac63e12d04b1d066cb899612b"} Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.658870 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a52d82e25fd446a34270a637b2053cc309c5fa1aac002f7a755a617f300c73af"} Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.658882 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.658899 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.658908 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.658970 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.659728 4546 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.659756 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.659765 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.660418 4546 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="87865a25eb5c15a9aa3c39a76bedd450341862f78796be8dafcaf2547641b7d2" exitCode=0 Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.660463 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"87865a25eb5c15a9aa3c39a76bedd450341862f78796be8dafcaf2547641b7d2"} Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.660477 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"69e8cbaf16640dc8405230e53aa482a5d9db2fe8c95ec5af36063d17431f6e6c"} Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.660517 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.660966 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.661349 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.661371 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 
01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.661379 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.661468 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.661482 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.661489 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.662410 4546 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6" exitCode=0 Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.662456 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6"} Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.662769 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"be7744a6de911d006ee54b6bbca74a02941cfcffcda454bf331ffc7e59709e1e"} Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.662838 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.663694 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.663753 4546 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.663764 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.664412 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a"} Feb 01 06:42:50 crc kubenswrapper[4546]: I0201 06:42:50.664432 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1939b89272132ad0a365a745e56d5603ec0add673082ffc8940fd7c55f2d0432"} Feb 01 06:42:50 crc kubenswrapper[4546]: W0201 06:42:50.839328 4546 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused Feb 01 06:42:50 crc kubenswrapper[4546]: E0201 06:42:50.839388 4546 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.196:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:42:51 crc kubenswrapper[4546]: E0201 06:42:51.020332 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: 
connection refused" interval="1.6s" Feb 01 06:42:51 crc kubenswrapper[4546]: W0201 06:42:51.031006 4546 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused Feb 01 06:42:51 crc kubenswrapper[4546]: E0201 06:42:51.031057 4546 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.196:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:42:51 crc kubenswrapper[4546]: W0201 06:42:51.056989 4546 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.196:6443: connect: connection refused Feb 01 06:42:51 crc kubenswrapper[4546]: E0201 06:42:51.057034 4546 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.196:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.200012 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.201430 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.201475 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 
01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.201486 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.201518 4546 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:42:51 crc kubenswrapper[4546]: E0201 06:42:51.202036 4546 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.196:6443: connect: connection refused" node="crc" Feb 01 06:42:51 crc kubenswrapper[4546]: E0201 06:42:51.423272 4546 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.196:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18900c4af3ab25cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:42:49.611797967 +0000 UTC m=+0.262733984,LastTimestamp:2026-02-01 06:42:49.611797967 +0000 UTC m=+0.262733984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.618905 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 07:24:09.246053832 +0000 UTC Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.668433 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.668507 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.668526 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.668453 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.669301 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.669334 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.669345 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.670473 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.670502 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.670513 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.670522 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.670529 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.670591 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.670989 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.671015 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.671024 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.671734 4546 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="fde11d91276bc7b6936472172de2061febc6ced6c68a3cd781f3610125fc39e7" exitCode=0 Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.671801 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fde11d91276bc7b6936472172de2061febc6ced6c68a3cd781f3610125fc39e7"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.671937 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.672477 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.672502 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.672512 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.672914 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e9ca109332035cb3553f13bf64fcd53687b226c671d48b29ef934739a900664a"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.672968 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.673645 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.673669 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.673679 4546 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.693053 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.693095 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.693107 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1"} Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.693176 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.695164 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.695298 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.695387 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:51 crc kubenswrapper[4546]: I0201 06:42:51.730081 4546 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.037206 4546 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.042719 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.042761 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.619160 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 23:08:49.435448188 +0000 UTC Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.697584 4546 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ef55bffac575c852f3c7f00a4ea5ec47075f3ae4fc9793236f1f4011ad7edf52" exitCode=0 Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.697666 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ef55bffac575c852f3c7f00a4ea5ec47075f3ae4fc9793236f1f4011ad7edf52"} Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.697894 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.697920 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.698063 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.698169 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 
06:42:52.699217 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.699259 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.699274 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.699234 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.699449 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.699509 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.699538 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.699610 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.699631 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.802581 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.803546 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.803586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.803597 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:52 crc kubenswrapper[4546]: I0201 06:42:52.803621 4546 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.619570 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 21:55:55.731223539 +0000 UTC Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.704947 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a22e630704c60205eb87d6795d9c01d0ec198ddf3375ab83aed2295736f248ce"} Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.705048 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.704993 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.705127 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.705135 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.705051 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2071a487e44d673048342fe3e758ff36ae48ca02ae932fdcb8c2bf3848b3031f"} Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.705211 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"54cf16b91823879f81f7a375dd216213228b70d71ed82f2be07a74fe565e4ad7"} Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.705236 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8e9dd222a206fdefb62ab3df4c8aed031e1fb2dbabf7115294a81a6365f33da0"} Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.705247 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"53cc8e338e4786c916278223fbb1ab3dd0bfe7458644a37e68427f6dcf317e94"} Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.705276 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.706345 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.706434 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.706489 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.707261 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.707301 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.707309 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 
06:42:53.707334 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.707364 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.707419 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:53 crc kubenswrapper[4546]: I0201 06:42:53.741913 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.424906 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.619657 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 19:20:49.945157254 +0000 UTC Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.707047 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.707091 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.707284 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.708070 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.708111 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.708123 4546 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.708245 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.708314 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:54 crc kubenswrapper[4546]: I0201 06:42:54.708381 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.620154 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 13:52:24.413650123 +0000 UTC Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.708671 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.709506 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.709570 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.709583 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.946839 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.947089 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.948017 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.948110 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:55 crc kubenswrapper[4546]: I0201 06:42:55.948177 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.599247 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.599374 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.599420 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.600425 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.600520 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.600585 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.620261 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 10:44:59.276544825 +0000 UTC Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.651667 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.651794 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.652527 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.652560 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:56 crc kubenswrapper[4546]: I0201 06:42:56.652571 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:42:57 crc kubenswrapper[4546]: I0201 06:42:57.620999 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 10:53:45.369424535 +0000 UTC Feb 01 06:42:58 crc kubenswrapper[4546]: I0201 06:42:58.621930 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 02:05:36.252317371 +0000 UTC Feb 01 06:42:59 crc kubenswrapper[4546]: I0201 06:42:59.599972 4546 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 01 06:42:59 crc kubenswrapper[4546]: I0201 06:42:59.600051 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 01 06:42:59 crc kubenswrapper[4546]: I0201 06:42:59.622277 4546 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 11:08:55.871007953 +0000 UTC Feb 01 06:42:59 crc kubenswrapper[4546]: E0201 06:42:59.700219 4546 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 01 06:42:59 crc kubenswrapper[4546]: I0201 06:42:59.911460 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:42:59 crc kubenswrapper[4546]: I0201 06:42:59.911586 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:42:59 crc kubenswrapper[4546]: I0201 06:42:59.912455 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:42:59 crc kubenswrapper[4546]: I0201 06:42:59.912485 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:42:59 crc kubenswrapper[4546]: I0201 06:42:59.912494 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:00 crc kubenswrapper[4546]: I0201 06:43:00.623064 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 23:01:44.682699844 +0000 UTC Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.114426 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.114579 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.115464 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.115513 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.115523 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.118181 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.614362 4546 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.623618 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 20:48:04.075550287 +0000 UTC Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.721439 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.722368 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.722414 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:01 crc kubenswrapper[4546]: I0201 06:43:01.722426 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:01 crc kubenswrapper[4546]: E0201 06:43:01.731879 4546 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from 
the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 01 06:43:02 crc kubenswrapper[4546]: W0201 06:43:02.228046 4546 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.228166 4546 trace.go:236] Trace[1603033525]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 06:42:52.226) (total time: 10001ms): Feb 01 06:43:02 crc kubenswrapper[4546]: Trace[1603033525]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:43:02.228) Feb 01 06:43:02 crc kubenswrapper[4546]: Trace[1603033525]: [10.001554008s] [10.001554008s] END Feb 01 06:43:02 crc kubenswrapper[4546]: E0201 06:43:02.228200 4546 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.348765 4546 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.348826 
4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.356391 4546 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.356453 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.526102 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.526358 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.527607 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.527689 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.527756 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.558875 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-etcd/etcd-crc" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.624608 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 15:57:55.641568328 +0000 UTC Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.723425 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.724624 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.724666 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.724687 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:02 crc kubenswrapper[4546]: I0201 06:43:02.747458 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 01 06:43:03 crc kubenswrapper[4546]: I0201 06:43:03.624895 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 05:17:09.226351973 +0000 UTC Feb 01 06:43:03 crc kubenswrapper[4546]: I0201 06:43:03.725667 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:43:03 crc kubenswrapper[4546]: I0201 06:43:03.726931 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:03 crc kubenswrapper[4546]: I0201 06:43:03.726966 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:03 crc kubenswrapper[4546]: I0201 06:43:03.726977 4546 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.430288 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.430668 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.431891 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.431928 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.431939 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.434999 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.625622 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 03:10:54.028103672 +0000 UTC Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.727073 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.728027 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.728167 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:04 crc kubenswrapper[4546]: I0201 06:43:04.728254 4546 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:05 crc kubenswrapper[4546]: I0201 06:43:05.626163 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 21:59:07.235342885 +0000 UTC Feb 01 06:43:05 crc kubenswrapper[4546]: I0201 06:43:05.825195 4546 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 01 06:43:05 crc kubenswrapper[4546]: I0201 06:43:05.835972 4546 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 01 06:43:06 crc kubenswrapper[4546]: I0201 06:43:06.618929 4546 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 01 06:43:06 crc kubenswrapper[4546]: I0201 06:43:06.627335 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 05:53:14.535147287 +0000 UTC Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.062020 4546 csr.go:261] certificate signing request csr-tkk64 is approved, waiting to be issued Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.077388 4546 csr.go:257] certificate signing request csr-tkk64 is issued Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.349576 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.351294 4546 trace.go:236] Trace[1178227068]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 06:42:54.088) (total time: 13262ms): Feb 01 06:43:07 crc kubenswrapper[4546]: Trace[1178227068]: ---"Objects listed" error: 13262ms (06:43:07.351) Feb 01 06:43:07 crc 
kubenswrapper[4546]: Trace[1178227068]: [13.26272197s] [13.26272197s] END Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.351336 4546 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.351659 4546 trace.go:236] Trace[372945205]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 06:42:53.100) (total time: 14251ms): Feb 01 06:43:07 crc kubenswrapper[4546]: Trace[372945205]: ---"Objects listed" error: 14251ms (06:43:07.351) Feb 01 06:43:07 crc kubenswrapper[4546]: Trace[372945205]: [14.251544826s] [14.251544826s] END Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.351789 4546 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.352622 4546 trace.go:236] Trace[1296002459]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 06:42:52.894) (total time: 14457ms): Feb 01 06:43:07 crc kubenswrapper[4546]: Trace[1296002459]: ---"Objects listed" error: 14457ms (06:43:07.352) Feb 01 06:43:07 crc kubenswrapper[4546]: Trace[1296002459]: [14.457843503s] [14.457843503s] END Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.352641 4546 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.353151 4546 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.354592 4546 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.374765 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.379083 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.416170 4546 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33894->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.416217 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33894->192.168.126.11:17697: read: connection reset by peer" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.416173 4546 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33890->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.416449 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33890->192.168.126.11:17697: read: connection reset by peer" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.416792 4546 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33908->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.416894 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33908->192.168.126.11:17697: read: connection reset by peer" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.417173 4546 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.417203 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.611992 4546 apiserver.go:52] "Watching apiserver" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.614605 4546 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.614888 4546 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.615224 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.615302 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.615395 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.615438 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.615625 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.615676 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.615938 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.615988 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.616094 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.617817 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.618310 4546 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.619345 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.619355 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.619400 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.620410 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.620500 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.620588 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.621769 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.624649 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 01 06:43:07 crc kubenswrapper[4546]: 
I0201 06:43:07.627547 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 12:10:25.711091392 +0000 UTC Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.650986 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655241 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655274 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655295 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655312 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655328 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655345 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655361 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655377 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:43:07 crc kubenswrapper[4546]: 
I0201 06:43:07.655392 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655407 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655421 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655435 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655454 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655469 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655488 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655504 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655548 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655565 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655579 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:43:07 
crc kubenswrapper[4546]: I0201 06:43:07.655594 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655610 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655627 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655645 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655666 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655680 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655695 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655710 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655756 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655789 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655895 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.655911 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:43:08.155870694 +0000 UTC m=+18.806806711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.655984 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656025 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656053 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656073 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656090 4546 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656109 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656132 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656154 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656173 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656193 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656212 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656233 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656250 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656267 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656289 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 06:43:07 crc 
kubenswrapper[4546]: I0201 06:43:07.656363 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656382 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656402 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656423 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656442 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656458 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656474 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656491 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656510 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656535 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656553 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656569 4546 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656575 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656586 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656669 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656698 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656730 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656725 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656765 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656775 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656749 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656795 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656913 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656952 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.656980 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657008 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657012 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657032 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657045 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657064 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657095 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657122 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657145 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657170 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657194 4546 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657214 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657217 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657259 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657281 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657302 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657322 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657345 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657361 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657384 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657404 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657426 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657445 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657460 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657464 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657536 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.657767 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658130 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658147 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658171 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658214 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658236 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658258 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658296 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658349 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658373 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658399 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658422 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658448 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658472 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658495 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658527 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658552 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658576 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658599 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658687 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658722 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658775 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658799 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658823 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658848 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658885 4546 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658908 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658933 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658957 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.658980 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659001 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659047 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659073 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659096 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659118 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659143 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659171 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659192 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659215 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659236 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659256 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659277 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:43:07 crc 
kubenswrapper[4546]: I0201 06:43:07.659297 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659319 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659340 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659362 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659386 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659395 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" 
(OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659409 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659464 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659487 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659512 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659544 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659564 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659584 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659605 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659625 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659644 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659660 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659678 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659696 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659716 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659731 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659754 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " 
Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659777 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659797 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659816 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659840 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659878 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659897 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659915 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659931 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659953 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659970 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.659986 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 01 06:43:07 crc 
kubenswrapper[4546]: I0201 06:43:07.660004 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660021 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660038 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660059 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660077 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660098 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" 
(UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660114 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660132 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660150 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660168 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660186 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 
06:43:07.660203 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660222 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660239 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660254 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660272 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660289 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660307 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660759 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.660762 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.661204 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.661590 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.661968 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662198 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662399 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662429 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662449 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662469 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662491 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662508 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662539 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662558 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662577 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662770 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662877 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.662997 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663027 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663047 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663040 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663065 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663243 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663316 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663344 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663364 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663388 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663377 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663401 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663407 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663410 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663490 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663717 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663819 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663877 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.663981 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664039 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664105 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664119 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664127 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664127 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664307 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664533 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664549 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664678 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664796 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664834 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.664915 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.665088 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.665167 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.665243 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.665630 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.665690 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.665867 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666014 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666141 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666231 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666377 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666427 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666584 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666681 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666811 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666846 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.666987 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.667485 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.667634 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.667719 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.668000 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.668051 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.668101 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.668396 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.668423 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.668459 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.668723 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.668819 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.668965 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.669032 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.669111 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.669505 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.669540 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.669825 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.669837 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.670163 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.670211 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.670212 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.670249 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.670303 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.670557 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.671005 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.671412 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.671729 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.672048 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.671949 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.672240 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.672541 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.671626 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.672872 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.670743 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.671748 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673164 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673194 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673222 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673243 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673266 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673309 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673357 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673382 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673403 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673424 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673448 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673486 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673579 4546 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673591 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673603 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673614 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673624 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673635 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673645 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673656 4546 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673666 4546 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673676 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673685 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673693 4546 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673702 4546 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673711 4546 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673720 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673728 4546 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc 
kubenswrapper[4546]: I0201 06:43:07.673736 4546 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673745 4546 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673754 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673762 4546 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673772 4546 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673781 4546 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673791 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673799 4546 reconciler_common.go:293] "Volume 
detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673807 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673820 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673829 4546 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673837 4546 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673846 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673932 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673944 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673955 4546 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673967 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673981 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.673990 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674000 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674010 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674018 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674027 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674035 4546 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674043 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674051 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674060 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674068 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674076 4546 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" 
DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674085 4546 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674092 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674101 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674108 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674105 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674117 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.674167 4546 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674180 4546 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674199 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.674217 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:08.174202275 +0000 UTC m=+18.825138291 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674234 4546 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674245 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674254 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674264 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674317 4546 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674327 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 
06:43:07.674335 4546 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674343 4546 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674351 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674360 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674368 4546 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674377 4546 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674385 4546 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674393 4546 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674402 4546 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674410 4546 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674418 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674426 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674438 4546 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674447 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674459 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674469 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674478 4546 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674487 4546 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674495 4546 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674505 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674523 4546 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674538 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674548 4546 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674556 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674565 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674574 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674585 4546 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674594 4546 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674510 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.672239 4546 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674758 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.674763 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.675306 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.675443 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.675892 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.675990 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.677296 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.678713 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.679084 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.679310 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.679732 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.679791 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.680069 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.680219 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.680228 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.680728 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.681111 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.681833 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.682068 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.682291 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.682618 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.682623 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.682960 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.683254 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.683342 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.683410 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.683642 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.683845 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.684358 4546 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.684418 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:08.184389117 +0000 UTC m=+18.835325133 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.685342 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.685660 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.685843 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.686105 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.686230 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.686396 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.686836 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.687135 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.687367 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.687365 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.687625 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.687911 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.688034 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.688129 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.688358 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.688549 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.688711 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.688902 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.688921 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.689107 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.689309 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.689659 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.690078 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.690443 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.690893 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.691160 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.691391 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.691600 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.691619 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.691818 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.691821 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.692045 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.692327 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.692570 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.692580 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.692775 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.693071 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.693387 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.694756 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.695333 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.695642 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.696157 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.697625 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.697880 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.698073 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.698371 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.698504 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.698808 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.698826 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.698866 4546 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.698908 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:08.198898127 +0000 UTC m=+18.849834143 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.698925 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.699157 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.699324 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.700021 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.700187 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.700338 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.700484 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.700494 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.700909 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.700943 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.701289 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.702055 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.702247 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.702253 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.702458 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.702551 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.702785 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.703061 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.704359 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.704424 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.705288 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.705319 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.705574 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.705986 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.706712 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.706744 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.706763 4546 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.706827 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:08.206808982 +0000 UTC m=+18.857744987 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.707352 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.707837 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.712957 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.713323 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.714105 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.714186 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.714371 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.714607 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.715151 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.715164 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.715717 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.716134 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.717051 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.718679 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.719188 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.719553 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.720535 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.720688 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.720717 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.721553 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.721979 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.721074 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.722664 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.723497 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.724225 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.725878 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.726039 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.727838 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.728461 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.730408 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.731170 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.732016 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.733496 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.735879 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.736883 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.739747 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.739949 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.745447 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.746403 4546 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab" exitCode=255 Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.746930 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab"} Feb 01 06:43:07 crc kubenswrapper[4546]: E0201 06:43:07.750745 4546 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.756157 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.759208 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.759459 4546 scope.go:117] "RemoveContainer" containerID="9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.765231 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.775721 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.775825 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.775950 4546 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776037 4546 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776107 4546 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776170 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776248 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776318 4546 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776400 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776454 4546 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776528 4546 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776596 4546 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776645 4546 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776713 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776784 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776834 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776916 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.776985 4546 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777034 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777111 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777163 4546 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777233 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" 
Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777300 4546 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777348 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777422 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777491 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777569 4546 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777646 4546 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777699 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777767 4546 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.777833 4546 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778024 4546 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778093 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778142 4546 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778210 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778258 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778327 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778374 4546 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778434 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778485 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778558 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778629 4546 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778684 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778754 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 
06:43:07.778802 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778880 4546 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.778956 4546 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779007 4546 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779078 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779150 4546 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779225 4546 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779273 4546 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779341 4546 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779394 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779464 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779544 4546 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779614 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779668 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779735 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node 
\"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779792 4546 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779873 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779934 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.779999 4546 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780053 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780102 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780151 4546 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780219 4546 reconciler_common.go:293] 
"Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780270 4546 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780314 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780364 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780416 4546 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780465 4546 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780519 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780573 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780644 4546 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780695 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780749 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780798 4546 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780842 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780906 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.780961 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781011 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781062 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781113 4546 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781158 4546 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781202 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781252 4546 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781323 4546 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath 
\"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781379 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781425 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781476 4546 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781533 4546 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781585 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781635 4546 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781685 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781744 4546 
reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781790 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781834 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781896 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.781955 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782002 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782046 4546 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782092 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782137 4546 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782187 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782233 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782281 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782333 4546 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782383 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782432 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 01 
06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.782532 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.784747 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.785088 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.803713 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.814025 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.833427 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.849278 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.930417 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.938977 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:43:07 crc kubenswrapper[4546]: W0201 06:43:07.940731 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-7ae016b84707b186ac0df86dd1ea115835df90fc20affbc9529d879e404e81c2 WatchSource:0}: Error finding container 7ae016b84707b186ac0df86dd1ea115835df90fc20affbc9529d879e404e81c2: Status 404 returned error can't find the container with id 7ae016b84707b186ac0df86dd1ea115835df90fc20affbc9529d879e404e81c2 Feb 01 06:43:07 crc kubenswrapper[4546]: I0201 06:43:07.944722 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:43:07 crc kubenswrapper[4546]: W0201 06:43:07.963469 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-069464b8abfe8503f5fbdae784d2b57f61b6f5b55e2320edffafb9a563b69c04 WatchSource:0}: Error finding container 069464b8abfe8503f5fbdae784d2b57f61b6f5b55e2320edffafb9a563b69c04: Status 404 returned error can't find the container with id 069464b8abfe8503f5fbdae784d2b57f61b6f5b55e2320edffafb9a563b69c04 Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.078542 4546 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-01 06:38:07 +0000 UTC, rotation deadline is 2026-11-30 12:40:46.955152127 +0000 UTC Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.078617 4546 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7253h57m38.876537943s for next certificate rotation Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.189648 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.189617 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:43:09.189596167 +0000 UTC m=+19.840532184 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.189870 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.189943 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:08 crc 
kubenswrapper[4546]: E0201 06:43:08.189975 4546 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.190019 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:09.190009364 +0000 UTC m=+19.840945380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.190077 4546 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.190158 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:09.190140106 +0000 UTC m=+19.841076122 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.290750 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.290826 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.290950 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.290976 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.290988 4546 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.291030 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.291059 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.291071 4546 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.291041 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:09.29102806 +0000 UTC m=+19.941964076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:08 crc kubenswrapper[4546]: E0201 06:43:08.291160 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-01 06:43:09.291121463 +0000 UTC m=+19.942057479 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.506214 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-c4gpz"] Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.506641 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c4gpz" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.509090 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.509866 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.510308 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.533078 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.570475 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.586889 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.592775 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/73cf3878-2b3f-4ac6-b698-c86ac72baa90-hosts-file\") pod \"node-resolver-c4gpz\" (UID: \"73cf3878-2b3f-4ac6-b698-c86ac72baa90\") " pod="openshift-dns/node-resolver-c4gpz" Feb 01 06:43:08 crc kubenswrapper[4546]: 
I0201 06:43:08.592941 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4xk5\" (UniqueName: \"kubernetes.io/projected/73cf3878-2b3f-4ac6-b698-c86ac72baa90-kube-api-access-n4xk5\") pod \"node-resolver-c4gpz\" (UID: \"73cf3878-2b3f-4ac6-b698-c86ac72baa90\") " pod="openshift-dns/node-resolver-c4gpz" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.614029 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.628431 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 22:47:31.66957636 +0000 UTC Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.630139 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.646940 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.671521 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.686003 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.694045 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/73cf3878-2b3f-4ac6-b698-c86ac72baa90-hosts-file\") pod \"node-resolver-c4gpz\" (UID: \"73cf3878-2b3f-4ac6-b698-c86ac72baa90\") " pod="openshift-dns/node-resolver-c4gpz" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.694151 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4xk5\" (UniqueName: 
\"kubernetes.io/projected/73cf3878-2b3f-4ac6-b698-c86ac72baa90-kube-api-access-n4xk5\") pod \"node-resolver-c4gpz\" (UID: \"73cf3878-2b3f-4ac6-b698-c86ac72baa90\") " pod="openshift-dns/node-resolver-c4gpz" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.694185 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/73cf3878-2b3f-4ac6-b698-c86ac72baa90-hosts-file\") pod \"node-resolver-c4gpz\" (UID: \"73cf3878-2b3f-4ac6-b698-c86ac72baa90\") " pod="openshift-dns/node-resolver-c4gpz" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.701532 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0
7372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.722400 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4xk5\" (UniqueName: \"kubernetes.io/projected/73cf3878-2b3f-4ac6-b698-c86ac72baa90-kube-api-access-n4xk5\") pod \"node-resolver-c4gpz\" (UID: 
\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\") " pod="openshift-dns/node-resolver-c4gpz" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.749954 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"069464b8abfe8503f5fbdae784d2b57f61b6f5b55e2320edffafb9a563b69c04"} Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.751624 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a"} Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.751665 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428"} Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.751677 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b19af1e273a4c83a8d02fba3ab724d57ae182e63f0a7e55f115fd803350cb24d"} Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.752982 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84"} Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.753022 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7ae016b84707b186ac0df86dd1ea115835df90fc20affbc9529d879e404e81c2"} Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.755000 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.756492 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92"} Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.756887 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.770798 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.815664 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.817749 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-c4gpz" Feb 01 06:43:08 crc kubenswrapper[4546]: W0201 06:43:08.828920 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73cf3878_2b3f_4ac6_b698_c86ac72baa90.slice/crio-7011bb7707a8229928a5ed3651b31b1758b74e4315527d164d6d4f9445dbcc53 WatchSource:0}: Error finding container 7011bb7707a8229928a5ed3651b31b1758b74e4315527d164d6d4f9445dbcc53: Status 404 returned error can't find the container with id 7011bb7707a8229928a5ed3651b31b1758b74e4315527d164d6d4f9445dbcc53 Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.849881 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.862364 4546 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.881075 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.910923 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-dwtsx"] Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.911310 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nwmnb"] Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.911307 4546 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-
recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.911435 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.911811 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.915108 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.915898 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.916363 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.916375 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.916548 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.916621 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.916682 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.916962 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.916989 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.917272 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.917665 4546 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-multus/multus-additional-cni-plugins-mj5bf"] Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.918298 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.922851 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.924333 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.955647 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996336 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wfgb\" (UniqueName: \"kubernetes.io/projected/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-kube-api-access-6wfgb\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " 
pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996382 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-conf-dir\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996433 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-cnibin\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996455 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/218c5efd-c97f-48e1-883f-ec381e0a559b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996516 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-system-cni-dir\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996548 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-daemon-config\") pod 
\"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996567 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-hostroot\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996586 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a4316448-1833-40f9-bdd7-e13d7dd4da6b-rootfs\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996602 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4316448-1833-40f9-bdd7-e13d7dd4da6b-proxy-tls\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996621 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-var-lib-cni-multus\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996640 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-var-lib-cni-bin\") pod 
\"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996656 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-os-release\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996674 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-run-k8s-cni-cncf-io\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996692 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-run-multus-certs\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996709 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh4jc\" (UniqueName: \"kubernetes.io/projected/218c5efd-c97f-48e1-883f-ec381e0a559b-kube-api-access-zh4jc\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996727 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/a4316448-1833-40f9-bdd7-e13d7dd4da6b-mcd-auth-proxy-config\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996746 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-run-netns\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996763 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-etc-kubernetes\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996791 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxv6k\" (UniqueName: \"kubernetes.io/projected/a4316448-1833-40f9-bdd7-e13d7dd4da6b-kube-api-access-wxv6k\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996806 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-cni-dir\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996823 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-cni-binary-copy\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996846 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-socket-dir-parent\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996878 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-var-lib-kubelet\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996897 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.996988 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-cnibin\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.997066 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/218c5efd-c97f-48e1-883f-ec381e0a559b-cni-binary-copy\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.997162 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-system-cni-dir\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:08 crc kubenswrapper[4546]: I0201 06:43:08.997204 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-os-release\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:08.998183 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.013010 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.032083 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.050631 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded 
serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.060308 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 
06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.070283 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852f
dee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287f
aaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.092574 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097490 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-conf-dir\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097523 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-cnibin\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097543 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/218c5efd-c97f-48e1-883f-ec381e0a559b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097577 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-system-cni-dir\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097613 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-conf-dir\") pod \"multus-nwmnb\" (UID: 
\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097638 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-cnibin\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097628 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-daemon-config\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097689 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-system-cni-dir\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097722 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-hostroot\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097747 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a4316448-1833-40f9-bdd7-e13d7dd4da6b-rootfs\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 
06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097768 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4316448-1833-40f9-bdd7-e13d7dd4da6b-proxy-tls\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097792 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-var-lib-cni-multus\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097816 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-var-lib-cni-bin\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097843 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-os-release\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097879 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-run-k8s-cni-cncf-io\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097901 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-run-multus-certs\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097923 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh4jc\" (UniqueName: \"kubernetes.io/projected/218c5efd-c97f-48e1-883f-ec381e0a559b-kube-api-access-zh4jc\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097947 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4316448-1833-40f9-bdd7-e13d7dd4da6b-mcd-auth-proxy-config\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097964 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-run-netns\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.097987 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-etc-kubernetes\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098023 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wxv6k\" (UniqueName: \"kubernetes.io/projected/a4316448-1833-40f9-bdd7-e13d7dd4da6b-kube-api-access-wxv6k\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098044 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-cni-dir\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098064 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-cni-binary-copy\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098133 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-socket-dir-parent\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098151 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-var-lib-kubelet\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098171 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098193 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-cnibin\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098210 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/218c5efd-c97f-48e1-883f-ec381e0a559b-cni-binary-copy\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098234 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-system-cni-dir\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098256 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-os-release\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098293 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wfgb\" (UniqueName: 
\"kubernetes.io/projected/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-kube-api-access-6wfgb\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098303 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-daemon-config\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098328 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/218c5efd-c97f-48e1-883f-ec381e0a559b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098357 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-hostroot\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098392 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a4316448-1833-40f9-bdd7-e13d7dd4da6b-rootfs\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098573 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-socket-dir-parent\") pod \"multus-nwmnb\" 
(UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098593 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-system-cni-dir\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098640 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-run-multus-certs\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098655 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-multus-cni-dir\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098582 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-var-lib-kubelet\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098691 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-var-lib-cni-bin\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098766 4546 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-var-lib-cni-multus\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098793 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-run-k8s-cni-cncf-io\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098818 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-host-run-netns\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098943 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-os-release\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098979 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-etc-kubernetes\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098949 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-os-release\") pod \"multus-additional-cni-plugins-mj5bf\" 
(UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.098994 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-cni-binary-copy\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.099011 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-cnibin\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.099099 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/218c5efd-c97f-48e1-883f-ec381e0a559b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.099351 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4316448-1833-40f9-bdd7-e13d7dd4da6b-mcd-auth-proxy-config\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.099828 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/218c5efd-c97f-48e1-883f-ec381e0a559b-cni-binary-copy\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " 
pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.104264 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4316448-1833-40f9-bdd7-e13d7dd4da6b-proxy-tls\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.112487 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.115541 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wfgb\" (UniqueName: \"kubernetes.io/projected/95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16-kube-api-access-6wfgb\") pod \"multus-nwmnb\" (UID: \"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\") " pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.116039 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh4jc\" (UniqueName: \"kubernetes.io/projected/218c5efd-c97f-48e1-883f-ec381e0a559b-kube-api-access-zh4jc\") pod \"multus-additional-cni-plugins-mj5bf\" (UID: \"218c5efd-c97f-48e1-883f-ec381e0a559b\") " pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.121291 4546 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wxv6k\" (UniqueName: \"kubernetes.io/projected/a4316448-1833-40f9-bdd7-e13d7dd4da6b-kube-api-access-wxv6k\") pod \"machine-config-daemon-dwtsx\" (UID: \"a4316448-1833-40f9-bdd7-e13d7dd4da6b\") " pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.128571 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.139888 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.148545 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.157335 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.165919 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.174913 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.199177 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:09 crc 
kubenswrapper[4546]: I0201 06:43:09.199298 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.199328 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.199420 4546 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.199478 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:11.199455532 +0000 UTC m=+21.850391548 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.199751 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:43:11.199742073 +0000 UTC m=+21.850678089 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.199792 4546 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.199815 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:11.199809047 +0000 UTC m=+21.850745062 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.242798 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.248679 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nwmnb" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.270294 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.288183 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod218c5efd_c97f_48e1_883f_ec381e0a559b.slice/crio-3b1f19c6ea34d6f25e54166cc7ea95fb6f8e32fec35a6d43369ddd50cb713afa WatchSource:0}: Error finding container 3b1f19c6ea34d6f25e54166cc7ea95fb6f8e32fec35a6d43369ddd50cb713afa: Status 404 returned error can't find the container with id 3b1f19c6ea34d6f25e54166cc7ea95fb6f8e32fec35a6d43369ddd50cb713afa Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.300187 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.300223 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.300333 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.300352 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.300364 4546 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.300364 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.300381 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.300393 4546 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered] Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.300410 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:11.300398047 +0000 UTC m=+21.951334063 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.300432 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:11.300421371 +0000 UTC m=+21.951357387 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.344297 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4klz2"] Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.345081 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.346478 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.346713 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.351009 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.351037 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.351383 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.351518 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.351584 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.365982 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.380283 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 
06:43:09.394683 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 
genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b
05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.409095 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.420838 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.434637 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.445986 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.454255 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.464336 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.477011 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.487261 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.495991 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504109 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504196 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovn-node-metrics-cert\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504287 
4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-systemd\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504361 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-bin\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504415 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g65h\" (UniqueName: \"kubernetes.io/projected/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-kube-api-access-2g65h\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504477 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-env-overrides\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504535 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-node-log\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 
06:43:09.504610 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-slash\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504679 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-kubelet\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504731 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-log-socket\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504786 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-script-lib\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.504867 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-openvswitch\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: 
I0201 06:43:09.504944 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-netd\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.505009 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-config\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.505070 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-ovn\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.505141 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-etc-openvswitch\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.505214 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-netns\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc 
kubenswrapper[4546]: I0201 06:43:09.505270 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-var-lib-openvswitch\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.505321 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-systemd-units\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.505370 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-ovn-kubernetes\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.507254 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.549691 4546 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.549943 4546 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: 
object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550011 4546 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"default-dockercfg-2q5b6": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550097 4546 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550134 4546 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550154 4546 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550171 4546 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550191 4546 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: very short watch: 
object-"openshift-multus"/"multus-daemon-config": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550212 4546 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550240 4546 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550258 4546 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550291 4546 reflector.go:484] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550303 4546 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550338 4546 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of 
*v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550453 4546 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-config": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550476 4546 reflector.go:484] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550731 4546 reflector.go:484] object-"openshift-ovn-kubernetes"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.550787 4546 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.551109 4546 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.551265 4546 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": 
watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605785 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-kubelet\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605823 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-log-socket\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605840 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-script-lib\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605872 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-openvswitch\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605887 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-netd\") pod 
\"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605903 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-ovn\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605918 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-config\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605946 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-etc-openvswitch\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605964 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-var-lib-openvswitch\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.605981 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-netns\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606003 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-systemd-units\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606018 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-ovn-kubernetes\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606042 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606058 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovn-node-metrics-cert\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606073 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-systemd\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606094 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-bin\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606109 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g65h\" (UniqueName: \"kubernetes.io/projected/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-kube-api-access-2g65h\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606124 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-env-overrides\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606140 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-node-log\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606163 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-slash\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: 
I0201 06:43:09.606211 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-slash\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606252 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-kubelet\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606272 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-log-socket\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606665 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-systemd-units\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606742 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-openvswitch\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606772 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-netd\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606798 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-ovn\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606934 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-script-lib\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606969 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.606978 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-bin\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.607050 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-ovn-kubernetes\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.607058 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-systemd\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.607182 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-node-log\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.607181 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-var-lib-openvswitch\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.607199 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-etc-openvswitch\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.607357 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-netns\") pod \"ovnkube-node-4klz2\" (UID: 
\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.607377 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-config\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.607505 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-env-overrides\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.610079 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovn-node-metrics-cert\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.623715 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g65h\" (UniqueName: \"kubernetes.io/projected/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-kube-api-access-2g65h\") pod \"ovnkube-node-4klz2\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.629458 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 17:34:52.233322073 +0000 UTC Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.654851 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.654903 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.654981 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.655150 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.655271 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:09 crc kubenswrapper[4546]: E0201 06:43:09.655382 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.658073 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.658563 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.659341 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.659381 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.659980 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.660534 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.661902 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.662507 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.663100 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.664091 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.664600 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.665491 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.666153 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.666956 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.667443 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.668281 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.668885 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.669263 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: W0201 06:43:09.670232 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4014c65_cdc3_4e2d_a7c3_2ac94248d488.slice/crio-b971ac7f229b44da93e305c9ae68ebcfe0d1f79ff970693247e95d72aef3bbda WatchSource:0}: Error finding container b971ac7f229b44da93e305c9ae68ebcfe0d1f79ff970693247e95d72aef3bbda: Status 404 returned error can't find the container with id b971ac7f229b44da93e305c9ae68ebcfe0d1f79ff970693247e95d72aef3bbda Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.670287 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.670798 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.671479 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.671836 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.672306 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.672998 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" 
Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.673397 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.673998 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.674701 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.675960 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.676661 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.677580 4546 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.677678 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.679178 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" 
path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.680167 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.680652 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.681298 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.682330 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.683142 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.692070 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.709755 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.740116 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.753650 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.766192 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605"} Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.767146 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.777257 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwmnb" event={"ID":"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16","Type":"ContainerStarted","Data":"bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271"} Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.777554 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwmnb" event={"ID":"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16","Type":"ContainerStarted","Data":"d27aa9baf37ec5c930306162e415631546b9760636e4e8a0ecbd390d2cac2f40"} Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.779036 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"b971ac7f229b44da93e305c9ae68ebcfe0d1f79ff970693247e95d72aef3bbda"} Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.780530 4546 generic.go:334] "Generic (PLEG): container finished" podID="218c5efd-c97f-48e1-883f-ec381e0a559b" containerID="b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2" exitCode=0 Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.780630 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" event={"ID":"218c5efd-c97f-48e1-883f-ec381e0a559b","Type":"ContainerDied","Data":"b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2"} 
Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.780791 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" event={"ID":"218c5efd-c97f-48e1-883f-ec381e0a559b","Type":"ContainerStarted","Data":"3b1f19c6ea34d6f25e54166cc7ea95fb6f8e32fec35a6d43369ddd50cb713afa"} Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.783634 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411"} Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.783710 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf"} Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.783764 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"bdf18a6c2b579c7882ef9fa131d532ed4718b1b2f43ab2c46dd2f7a17996a8ab"} Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.789548 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c4gpz" event={"ID":"73cf3878-2b3f-4ac6-b698-c86ac72baa90","Type":"ContainerStarted","Data":"812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47"} Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.789582 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c4gpz" event={"ID":"73cf3878-2b3f-4ac6-b698-c86ac72baa90","Type":"ContainerStarted","Data":"7011bb7707a8229928a5ed3651b31b1758b74e4315527d164d6d4f9445dbcc53"} Feb 01 06:43:09 crc 
kubenswrapper[4546]: I0201 06:43:09.792323 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.821842 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.849522 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.862848 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.882459 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.894012 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.913468 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.923054 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.933261 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.941302 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.953070 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.964551 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.978046 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:09 crc kubenswrapper[4546]: I0201 06:43:09.990050 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.003641 4546 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.016782 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.029723 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b
7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.039832 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.062084 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.079495 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.391895 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.453754 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.460717 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.555469 4546 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.557428 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.557554 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.557622 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.557736 4546 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.564405 4546 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.564645 4546 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.565382 4546 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.565406 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.565413 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.565426 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.565435 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:10Z","lastTransitionTime":"2026-02-01T06:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.570939 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 01 06:43:10 crc kubenswrapper[4546]: E0201 06:43:10.579365 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a818
5f08b8ca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.582268 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.582294 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.582302 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.582312 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.582322 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:10Z","lastTransitionTime":"2026-02-01T06:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:10 crc kubenswrapper[4546]: E0201 06:43:10.592996 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.595430 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.595479 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.595489 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.595499 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.595505 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:10Z","lastTransitionTime":"2026-02-01T06:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.598953 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 01 06:43:10 crc kubenswrapper[4546]: E0201 06:43:10.604045 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a818
5f08b8ca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.606295 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.606323 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.606330 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.606341 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.606352 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:10Z","lastTransitionTime":"2026-02-01T06:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:10 crc kubenswrapper[4546]: E0201 06:43:10.614779 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.617326 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.617349 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.617357 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.617369 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.617377 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:10Z","lastTransitionTime":"2026-02-01T06:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:10 crc kubenswrapper[4546]: E0201 06:43:10.625848 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: E0201 06:43:10.625968 4546 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.627190 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.627216 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.627231 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.627241 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.627250 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:10Z","lastTransitionTime":"2026-02-01T06:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.630543 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 14:03:15.368743983 +0000 UTC Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.632495 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.707426 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.729012 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.729048 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.729060 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.729074 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.729084 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:10Z","lastTransitionTime":"2026-02-01T06:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.734137 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.751448 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.778363 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.794041 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5" exitCode=0 Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.794072 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5"} Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.797051 4546 generic.go:334] "Generic (PLEG): container finished" podID="218c5efd-c97f-48e1-883f-ec381e0a559b" containerID="045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5" exitCode=0 Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.797143 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" event={"ID":"218c5efd-c97f-48e1-883f-ec381e0a559b","Type":"ContainerDied","Data":"045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5"} Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.811376 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.827663 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.832314 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.832367 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.832379 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.832427 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.832447 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:10Z","lastTransitionTime":"2026-02-01T06:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.838972 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99
422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.849565 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.859578 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.867592 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.877352 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.887934 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.906363 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.918798 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.929825 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.934799 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.934836 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.934848 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.934881 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.934894 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:10Z","lastTransitionTime":"2026-02-01T06:43:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.937613 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.948983 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.959693 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.963607 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\
"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint
\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.993267 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:10 crc kubenswrapper[4546]: I0201 06:43:10.994550 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.008235 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.014877 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.023117 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.031136 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.036966 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.036987 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.036995 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.037020 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.037031 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.045253 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.046400 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.049657 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.059318 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.068369 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.084090 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.125538 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.139147 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.139176 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.139187 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.139204 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.139215 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.153946 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.183813 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kub
e-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.193967 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.221242 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 
06:43:11.221312 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.221365 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.221472 4546 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.221516 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:15.221505203 +0000 UTC m=+25.872441219 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.221573 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:43:15.22156764 +0000 UTC m=+25.872503646 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.221635 4546 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.221660 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:15.221653953 +0000 UTC m=+25.872589969 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.241550 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.241598 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.241611 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.241635 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.241650 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.241790 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.283635 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.319598 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.322034 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.322124 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.322219 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.322254 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.322232 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.322267 4546 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.322276 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.322288 4546 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.322309 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:15.322296596 +0000 UTC m=+25.973232612 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.322324 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:15.322318076 +0000 UTC m=+25.973254093 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.344267 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.344296 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.344306 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.344351 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.344363 4546 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.361210 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.445723 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.445757 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.445767 4546 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.445785 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.445796 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.547687 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.547817 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.547903 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.548072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.548249 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.631497 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 12:55:12.032263189 +0000 UTC Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.655350 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.655356 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.655528 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.655676 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.655702 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.655717 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.655739 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.655723 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.655756 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.656051 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:11 crc kubenswrapper[4546]: E0201 06:43:11.656947 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.758586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.758631 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.758641 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.758655 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.758665 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.804889 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.805315 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.805337 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.805416 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.805430 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.805451 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} Feb 01 06:43:11 crc kubenswrapper[4546]: 
I0201 06:43:11.807146 4546 generic.go:334] "Generic (PLEG): container finished" podID="218c5efd-c97f-48e1-883f-ec381e0a559b" containerID="efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497" exitCode=0 Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.807242 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" event={"ID":"218c5efd-c97f-48e1-883f-ec381e0a559b","Type":"ContainerDied","Data":"efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.822632 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 
06:43:11.833741 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.846353 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.857843 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.861324 4546 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.861351 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.861361 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.861375 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.861383 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.870218 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.878740 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.886298 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.894271 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.903217 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.918818 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.928067 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.936635 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.946033 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:11Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.963772 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.963814 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:11 crc 
kubenswrapper[4546]: I0201 06:43:11.963828 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.963847 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:11 crc kubenswrapper[4546]: I0201 06:43:11.963878 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:11Z","lastTransitionTime":"2026-02-01T06:43:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.066027 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.066070 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.066080 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.066096 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.066107 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.168896 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.168928 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.168940 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.168959 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.168969 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.271284 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.271337 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.271352 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.271374 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.271387 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.373424 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.373469 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.373480 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.373496 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.373508 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.475605 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.475640 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.475653 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.475670 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.475682 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.578251 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.578285 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.578295 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.578309 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.578322 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.632595 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 17:08:37.918045585 +0000 UTC Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.680429 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.680510 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.680527 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.680549 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.680565 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.782847 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.782901 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.782923 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.782947 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.782959 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.813814 4546 generic.go:334] "Generic (PLEG): container finished" podID="218c5efd-c97f-48e1-883f-ec381e0a559b" containerID="133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b" exitCode=0 Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.813898 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" event={"ID":"218c5efd-c97f-48e1-883f-ec381e0a559b","Type":"ContainerDied","Data":"133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.835779 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af2
1f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.848367 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded 
serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.861163 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 
06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.872211 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.881172 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.885155 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.885191 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.885201 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.885218 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.885232 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.891878 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.901808 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.918952 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.929110 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.940833 4546 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.951637 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.961202 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.970010 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.987450 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.987500 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.987510 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.987523 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:12 crc kubenswrapper[4546]: I0201 06:43:12.987534 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:12Z","lastTransitionTime":"2026-02-01T06:43:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.089910 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.089966 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.089980 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.090004 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.090018 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:13Z","lastTransitionTime":"2026-02-01T06:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.192439 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.192491 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.192503 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.192520 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.192532 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:13Z","lastTransitionTime":"2026-02-01T06:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.294669 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.294698 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.294707 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.294723 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.294735 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:13Z","lastTransitionTime":"2026-02-01T06:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.396372 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.396412 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.396424 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.396439 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.396455 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:13Z","lastTransitionTime":"2026-02-01T06:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.498217 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.498257 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.498267 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.498284 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.498296 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:13Z","lastTransitionTime":"2026-02-01T06:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.600434 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.600477 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.600486 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.600500 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.600510 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:13Z","lastTransitionTime":"2026-02-01T06:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.633320 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 21:29:19.307938012 +0000 UTC Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.654900 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.654942 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:13 crc kubenswrapper[4546]: E0201 06:43:13.655123 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.654947 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:13 crc kubenswrapper[4546]: E0201 06:43:13.655313 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:13 crc kubenswrapper[4546]: E0201 06:43:13.655465 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.702595 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.702686 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.702749 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.702815 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.702896 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:13Z","lastTransitionTime":"2026-02-01T06:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.804900 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.805001 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.805062 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.805128 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.805181 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:13Z","lastTransitionTime":"2026-02-01T06:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.820624 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.823735 4546 generic.go:334] "Generic (PLEG): container finished" podID="218c5efd-c97f-48e1-883f-ec381e0a559b" containerID="b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab" exitCode=0 Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.823767 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" event={"ID":"218c5efd-c97f-48e1-883f-ec381e0a559b","Type":"ContainerDied","Data":"b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.840397 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.850509 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.861370 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.873151 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.884629 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded 
serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.898383 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 
06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.906942 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.906980 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.906991 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.907009 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.907021 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:13Z","lastTransitionTime":"2026-02-01T06:43:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.907264 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99
422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.916726 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.925091 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.932618 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.943324 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.953456 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:13 crc kubenswrapper[4546]: I0201 06:43:13.966373 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:13Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.010790 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.011056 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.011069 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.011091 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.011103 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.113155 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.113192 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.113204 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.113226 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.113239 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.215058 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.215090 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.215101 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.215122 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.215137 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.316650 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.316681 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.316690 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.316706 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.316718 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.418709 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.418748 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.418760 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.418781 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.418794 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.520479 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.520518 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.520529 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.520551 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.520565 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.623122 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.623154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.623165 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.623179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.623193 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.634313 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 01:31:41.221024105 +0000 UTC Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.725627 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.725666 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.725676 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.725707 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.725719 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.827597 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.827631 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.827640 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.827654 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.827666 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.830487 4546 generic.go:334] "Generic (PLEG): container finished" podID="218c5efd-c97f-48e1-883f-ec381e0a559b" containerID="d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c" exitCode=0 Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.830526 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" event={"ID":"218c5efd-c97f-48e1-883f-ec381e0a559b","Type":"ContainerDied","Data":"d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.846142 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.860908 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.875218 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.884512 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d
3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.894292 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.902311 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.913055 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6
a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.928489 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.933199 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.933230 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.933242 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.933257 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.933269 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:14Z","lastTransitionTime":"2026-02-01T06:43:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.944838 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.956661 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.966347 4546 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.978069 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:14 crc kubenswrapper[4546]: I0201 06:43:14.987349 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:14Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.035767 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.035802 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.035814 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.035832 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.035846 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.138768 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.138817 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.138828 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.138849 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.138879 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.229882 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-fxcn7"] Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.230353 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.232569 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.233571 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.233791 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.233996 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.241390 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.241420 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.241432 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.241458 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.241471 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.245212 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.255309 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.257030 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.257163 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.257204 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:43:23.257184794 +0000 UTC m=+33.908120810 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.257271 4546 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.257334 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:23.257317756 +0000 UTC m=+33.908253772 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.257272 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.257383 4546 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.257435 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:23.257426891 +0000 UTC m=+33.908362907 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.265577 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.276723 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.284666 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.294465 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.304115 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.313892 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.324722 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.334087 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.341786 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.343381 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.343422 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.343432 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.343459 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.343470 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.351063 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.358082 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62d4004d-9bf8-4b57-9193-4a8ad5aa3977-host\") pod \"node-ca-fxcn7\" (UID: \"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\") " pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.358124 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62d4004d-9bf8-4b57-9193-4a8ad5aa3977-serviceca\") pod \"node-ca-fxcn7\" (UID: \"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\") " pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.358161 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.358216 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.358267 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b5pw\" (UniqueName: \"kubernetes.io/projected/62d4004d-9bf8-4b57-9193-4a8ad5aa3977-kube-api-access-4b5pw\") pod \"node-ca-fxcn7\" (UID: 
\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\") " pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.358333 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.358360 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.358376 4546 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.358426 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:23.35841186 +0000 UTC m=+34.009347876 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.358435 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.358474 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.358492 4546 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.358556 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:23.358533579 +0000 UTC m=+34.009469595 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.360656 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mu
ltus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"st
artTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.374729 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.445502 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.445569 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.445583 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.445610 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.445630 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.459185 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b5pw\" (UniqueName: \"kubernetes.io/projected/62d4004d-9bf8-4b57-9193-4a8ad5aa3977-kube-api-access-4b5pw\") pod \"node-ca-fxcn7\" (UID: \"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\") " pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.459229 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62d4004d-9bf8-4b57-9193-4a8ad5aa3977-host\") pod \"node-ca-fxcn7\" (UID: \"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\") " pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.459506 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62d4004d-9bf8-4b57-9193-4a8ad5aa3977-serviceca\") pod \"node-ca-fxcn7\" (UID: \"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\") " pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.459934 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62d4004d-9bf8-4b57-9193-4a8ad5aa3977-host\") pod \"node-ca-fxcn7\" (UID: \"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\") " pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.465737 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62d4004d-9bf8-4b57-9193-4a8ad5aa3977-serviceca\") pod \"node-ca-fxcn7\" (UID: \"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\") " pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.479089 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4b5pw\" (UniqueName: \"kubernetes.io/projected/62d4004d-9bf8-4b57-9193-4a8ad5aa3977-kube-api-access-4b5pw\") pod \"node-ca-fxcn7\" (UID: \"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\") " pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.542193 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fxcn7" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.547500 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.547545 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.547564 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.547583 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.547595 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: W0201 06:43:15.551264 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d4004d_9bf8_4b57_9193_4a8ad5aa3977.slice/crio-58d866f3d973fac17ee90e3c79c32b4609eccf54a0d153b5838af029f3e1bc1d WatchSource:0}: Error finding container 58d866f3d973fac17ee90e3c79c32b4609eccf54a0d153b5838af029f3e1bc1d: Status 404 returned error can't find the container with id 58d866f3d973fac17ee90e3c79c32b4609eccf54a0d153b5838af029f3e1bc1d Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.635198 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 20:34:45.091200231 +0000 UTC Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.649637 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.649663 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.649673 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.649688 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.649699 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.654073 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.654136 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.654245 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.654236 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.654342 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:15 crc kubenswrapper[4546]: E0201 06:43:15.654430 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.751832 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.751896 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.751910 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.751926 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.751941 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.840084 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.840412 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.840464 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.848049 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" event={"ID":"218c5efd-c97f-48e1-883f-ec381e0a559b","Type":"ContainerStarted","Data":"4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.849499 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fxcn7" event={"ID":"62d4004d-9bf8-4b57-9193-4a8ad5aa3977","Type":"ContainerStarted","Data":"2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.849528 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fxcn7" event={"ID":"62d4004d-9bf8-4b57-9193-4a8ad5aa3977","Type":"ContainerStarted","Data":"58d866f3d973fac17ee90e3c79c32b4609eccf54a0d153b5838af029f3e1bc1d"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.854231 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.854262 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.854273 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.854289 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.854302 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.856180 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded 
serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.862079 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.865095 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.868454 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.878902 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.896683 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.908032 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.918748 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.928992 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.937187 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.946093 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.955335 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.956684 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.956715 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.956725 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.956741 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.956750 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:15Z","lastTransitionTime":"2026-02-01T06:43:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.966743 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.977094 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.987351 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:15 crc kubenswrapper[4546]: I0201 06:43:15.994421 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:15Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.001578 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.014847 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81
aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f4
04805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.022919 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.033769 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.043005 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d
3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.051401 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.058377 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.058402 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.058412 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.058426 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.058435 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.062840 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.077415 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.092411 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T0
6:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.113834 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.125357 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.134772 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.142722 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.150470 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.160925 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.160957 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.160966 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.160978 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.160986 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.263381 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.263424 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.263433 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.263446 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.263467 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.365172 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.365230 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.365240 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.365256 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.365265 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.467250 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.467295 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.467304 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.467321 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.467334 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.569729 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.569762 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.569773 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.569787 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.569795 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.635719 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 02:25:55.849120893 +0000 UTC Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.671878 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.671900 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.671910 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.671924 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.671935 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.774550 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.774598 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.774606 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.774620 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.774629 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.852594 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.877008 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.877041 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.877050 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.877066 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.877075 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.979389 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.979417 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.979425 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.979438 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:16 crc kubenswrapper[4546]: I0201 06:43:16.979446 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:16Z","lastTransitionTime":"2026-02-01T06:43:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.081653 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.081692 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.081700 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.081715 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.081725 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:17Z","lastTransitionTime":"2026-02-01T06:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.184039 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.184073 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.184081 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.184094 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.184106 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:17Z","lastTransitionTime":"2026-02-01T06:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.286121 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.286160 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.286168 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.286182 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.286192 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:17Z","lastTransitionTime":"2026-02-01T06:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.388489 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.388524 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.388534 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.388549 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.388559 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:17Z","lastTransitionTime":"2026-02-01T06:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.490540 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.490585 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.490598 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.490613 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.490623 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:17Z","lastTransitionTime":"2026-02-01T06:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.592707 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.592758 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.592769 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.592798 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.592812 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:17Z","lastTransitionTime":"2026-02-01T06:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.636080 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 15:59:57.460679488 +0000 UTC Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.654491 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.654545 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.654492 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:17 crc kubenswrapper[4546]: E0201 06:43:17.654607 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:17 crc kubenswrapper[4546]: E0201 06:43:17.654787 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:17 crc kubenswrapper[4546]: E0201 06:43:17.654935 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.695255 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.695294 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.695304 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.695319 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.695332 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:17Z","lastTransitionTime":"2026-02-01T06:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.798729 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.798779 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.798791 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.798810 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.798825 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:17Z","lastTransitionTime":"2026-02-01T06:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.856355 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/0.log" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.859158 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca" exitCode=1 Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.859192 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.859769 4546 scope.go:117] "RemoveContainer" containerID="62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.871095 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.881035 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.895768 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.902009 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.902035 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.902045 4546 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.902059 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.902070 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:17Z","lastTransitionTime":"2026-02-01T06:43:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.910285 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b
9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.921789 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.935375 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded 
serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.949281 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 
06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.957691 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.969102 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:17 crc kubenswrapper[4546]: I0201 06:43:17.979662 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T0
6:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.001411 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:17Z\\\",\\\"message\\\":\\\"r removal\\\\nI0201 06:43:17.459477 5721 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:43:17.459485 5721 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:43:17.459493 
5721 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 06:43:17.459538 5721 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0201 06:43:17.459545 5721 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0201 06:43:17.459610 5721 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0201 06:43:17.459886 5721 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 06:43:17.459897 5721 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:43:17.459899 5721 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0201 06:43:17.459902 5721 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 06:43:17.459912 5721 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0201 06:43:17.459932 5721 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 06:43:17.459960 5721 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 06:43:17.459982 5721 factory.go:656] Stopping watch factory\\\\nI0201 06:43:17.459992 5721 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:43:17.459963 5721 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 
06:43:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189
fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:17Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.003949 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.003992 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.004005 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.004029 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.004060 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.012989 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.022534 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.031117 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.106765 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.106795 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.106806 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.106823 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.106836 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.208369 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.208431 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.208439 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.208465 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.208476 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.310831 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.310893 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.310904 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.310915 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.310923 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.414207 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.414251 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.414261 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.414278 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.414289 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.516960 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.517023 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.517037 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.517065 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.517084 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.619312 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.619362 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.619375 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.619395 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.619413 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.636433 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 14:54:47.308478533 +0000 UTC Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.722160 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.722205 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.722216 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.722233 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.722245 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.824888 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.824950 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.824963 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.824988 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.825004 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.863979 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/1.log" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.864672 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/0.log" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.867741 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36" exitCode=1 Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.867790 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.867847 4546 scope.go:117] "RemoveContainer" containerID="62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.868597 4546 scope.go:117] "RemoveContainer" containerID="7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36" Feb 01 06:43:18 crc kubenswrapper[4546]: E0201 06:43:18.868824 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.885448 4546 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for 
shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7
bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.897462 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.910395 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.922649 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.928072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.928113 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.928123 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.928140 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.928152 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:18Z","lastTransitionTime":"2026-02-01T06:43:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.933231 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.941188 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.950671 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.960660 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.975940 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:17Z\\\",\\\"message\\\":\\\"r removal\\\\nI0201 06:43:17.459477 5721 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:43:17.459485 5721 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:43:17.459493 5721 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 06:43:17.459538 5721 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0201 06:43:17.459545 5721 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0201 06:43:17.459610 5721 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0201 06:43:17.459886 5721 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 06:43:17.459897 5721 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:43:17.459899 5721 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0201 06:43:17.459902 5721 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 06:43:17.459912 5721 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0201 06:43:17.459932 5721 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 06:43:17.459960 5721 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 06:43:17.459982 5721 factory.go:656] Stopping watch factory\\\\nI0201 06:43:17.459992 5721 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:43:17.459963 5721 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:43:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.987039 4546 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:18 crc kubenswrapper[4546]: I0201 06:43:18.996610 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.007417 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.019947 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.028198 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.029909 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.029951 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.029970 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.029987 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.030000 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.132031 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.132068 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.132080 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.132100 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.132114 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.234068 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.234094 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.234105 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.234119 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.234143 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.336264 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.336299 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.336310 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.336324 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.336333 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.437657 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.437714 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.437726 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.437740 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.437752 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.539898 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.539936 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.539949 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.539961 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.539973 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.636614 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 15:03:06.664371165 +0000 UTC Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.642286 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.642307 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.642315 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.642327 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.642339 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.654007 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.654079 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:19 crc kubenswrapper[4546]: E0201 06:43:19.654142 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:19 crc kubenswrapper[4546]: E0201 06:43:19.654211 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.654307 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:19 crc kubenswrapper[4546]: E0201 06:43:19.654390 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.665587 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.676288 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.685637 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.693263 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.700648 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.715668 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.730998 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b64bdd4831b65558d3df775203c00046dbe2ab0743a6c151cbea00743f4dca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:17Z\\\",\\\"message\\\":\\\"r removal\\\\nI0201 06:43:17.459477 5721 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:43:17.459485 5721 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:43:17.459493 5721 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 06:43:17.459538 5721 handler.go:190] Sending 
*v1.Pod event handler 3 for removal\\\\nI0201 06:43:17.459545 5721 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0201 06:43:17.459610 5721 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0201 06:43:17.459886 5721 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 06:43:17.459897 5721 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:43:17.459899 5721 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0201 06:43:17.459902 5721 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 06:43:17.459912 5721 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0201 06:43:17.459932 5721 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 06:43:17.459960 5721 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 06:43:17.459982 5721 factory.go:656] Stopping watch factory\\\\nI0201 06:43:17.459992 5721 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:43:17.459963 5721 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:43:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.740028 4546 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.744140 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.744514 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.744525 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.744541 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.744559 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.751545 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.761228 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.771630 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.778485 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.787578 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.796828 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",
\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.846624 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.846784 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.846927 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.847014 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.847075 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.872752 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/1.log" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.876331 4546 scope.go:117] "RemoveContainer" containerID="7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36" Feb 01 06:43:19 crc kubenswrapper[4546]: E0201 06:43:19.876502 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.886490 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded 
serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.895938 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 
06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.906452 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"
mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 
06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.921051 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.930175 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.939681 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.948755 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.948792 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.948822 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.948838 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.948874 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.948847 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:19Z","lastTransitionTime":"2026-02-01T06:43:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.956354 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.964798 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.975349 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.989033 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:19 crc kubenswrapper[4546]: I0201 06:43:19.997164 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.008078 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.015400 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.051102 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.051138 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.051148 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.051164 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.051176 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.153608 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.153641 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.153655 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.153671 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.153681 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.256128 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.256177 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.256186 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.256199 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.256208 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.357913 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.357946 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.357955 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.357969 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.357978 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.460130 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.460157 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.460166 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.460179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.460188 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.562299 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.562328 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.562339 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.562353 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.562362 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.637008 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 03:14:16.769748205 +0000 UTC Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.664327 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.664354 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.664363 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.664373 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.664382 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.755844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.755977 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.756036 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.756095 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.756148 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: E0201 06:43:20.764172 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.766921 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.766958 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.766968 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.766980 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.766990 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.788834 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.788892 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.788905 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.788923 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.788933 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: E0201 06:43:20.796828 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.799080 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.799099 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.799107 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.799117 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.799124 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: E0201 06:43:20.807081 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:20 crc kubenswrapper[4546]: E0201 06:43:20.807189 4546 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.808142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.808165 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.808173 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.808183 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.808190 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.909352 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.909377 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.909386 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.909397 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:20 crc kubenswrapper[4546]: I0201 06:43:20.909406 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:20Z","lastTransitionTime":"2026-02-01T06:43:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.011153 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.011187 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.011196 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.011209 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.011219 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.113002 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.113029 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.113037 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.113062 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.113072 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.214726 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.214751 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.214767 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.214779 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.214788 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.305628 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m"] Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.306168 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.308987 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.309265 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.316600 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.316633 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.316644 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.316656 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.316665 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.319669 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.328839 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.339571 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded 
serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.351631 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.360573 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.369310 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.377871 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6
a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.387533 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.400724 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.410355 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.412217 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.412249 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.412280 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fcbb8\" (UniqueName: \"kubernetes.io/projected/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-kube-api-access-fcbb8\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.412327 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.418748 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.418925 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.419093 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.419262 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.419412 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.420273 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.429340 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.438062 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.445712 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.456600 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81
aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f4
04805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.513007 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.513099 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.513218 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fcbb8\" (UniqueName: \"kubernetes.io/projected/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-kube-api-access-fcbb8\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.513309 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.514091 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.514177 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.519767 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.521249 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.521274 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.521284 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.521302 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.521313 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.526545 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcbb8\" (UniqueName: \"kubernetes.io/projected/9032e2c3-caef-4e24-95a3-2d67a9a1e8c1-kube-api-access-fcbb8\") pod \"ovnkube-control-plane-749d76644c-z487m\" (UID: \"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.617397 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.623815 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.623845 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.623867 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.623889 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.623901 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: W0201 06:43:21.630914 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9032e2c3_caef_4e24_95a3_2d67a9a1e8c1.slice/crio-667f657cb031584008d2998b32f33499b5f45bf1245b140e78ee0798f8d9d1ad WatchSource:0}: Error finding container 667f657cb031584008d2998b32f33499b5f45bf1245b140e78ee0798f8d9d1ad: Status 404 returned error can't find the container with id 667f657cb031584008d2998b32f33499b5f45bf1245b140e78ee0798f8d9d1ad Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.638208 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 03:57:20.696214293 +0000 UTC Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.654206 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:21 crc kubenswrapper[4546]: E0201 06:43:21.654298 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.654583 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:21 crc kubenswrapper[4546]: E0201 06:43:21.654646 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.654775 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:21 crc kubenswrapper[4546]: E0201 06:43:21.654826 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.726086 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.726132 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.726144 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.726165 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.726183 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.828815 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.828848 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.828887 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.828902 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.828912 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.885119 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" event={"ID":"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1","Type":"ContainerStarted","Data":"d0ca6b483f454c4f25c6c681267addb8f5f515e3891e1005d2594426172932e2"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.885170 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" event={"ID":"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1","Type":"ContainerStarted","Data":"e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.885185 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" event={"ID":"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1","Type":"ContainerStarted","Data":"667f657cb031584008d2998b32f33499b5f45bf1245b140e78ee0798f8d9d1ad"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.897203 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.906289 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.923205 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.930811 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.930921 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.931011 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.931085 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.931145 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:21Z","lastTransitionTime":"2026-02-01T06:43:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.941422 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.958085 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.969691 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.980481 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.989923 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:21 crc kubenswrapper[4546]: I0201 06:43:21.998093 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:21Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.007905 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.017671 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.026255 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.034083 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.034143 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.034159 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.034179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.034196 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.036445 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded 
serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.048101 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 
06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.057959 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f515e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.136795 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.136849 
4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.136885 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.136906 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.136919 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.239605 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.239639 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.239650 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.239664 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.239675 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.342198 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.342234 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.342245 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.342259 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.342271 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.367558 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8tdck"] Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.368155 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:22 crc kubenswrapper[4546]: E0201 06:43:22.368219 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.379968 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf
81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695
f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2
026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.387639 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.398077 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded 
serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.413241 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 
06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.428475 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f515e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.445195 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.445232 
4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.445208 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.445242 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.445361 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.445375 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.461975 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.482942 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.497954 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.508277 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.523241 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.523643 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.523703 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9zzv\" (UniqueName: \"kubernetes.io/projected/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-kube-api-access-h9zzv\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.539605 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.547714 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.547751 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.547762 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.547777 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.547788 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.551190 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc 
kubenswrapper[4546]: I0201 06:43:22.562148 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.571634 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.580618 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.624483 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.624554 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9zzv\" (UniqueName: \"kubernetes.io/projected/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-kube-api-access-h9zzv\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:22 crc kubenswrapper[4546]: E0201 06:43:22.624679 4546 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:22 crc kubenswrapper[4546]: E0201 06:43:22.624754 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs podName:1ca3c024-0f0b-4651-8eb7-9a7e0511739c nodeName:}" failed. No retries permitted until 2026-02-01 06:43:23.124734678 +0000 UTC m=+33.775670704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs") pod "network-metrics-daemon-8tdck" (UID: "1ca3c024-0f0b-4651-8eb7-9a7e0511739c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.637994 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9zzv\" (UniqueName: \"kubernetes.io/projected/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-kube-api-access-h9zzv\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.638683 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 23:02:54.494080404 +0000 UTC Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.650749 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.650846 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.650944 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 
06:43:22.651013 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.651071 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.753987 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.754020 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.754031 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.754048 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.754057 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.856024 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.856053 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.856065 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.856080 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.856092 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.958675 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.958730 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.958745 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.958768 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:22 crc kubenswrapper[4546]: I0201 06:43:22.958783 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:22Z","lastTransitionTime":"2026-02-01T06:43:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.060798 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.060828 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.060838 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.060849 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.060874 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.130679 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.130885 4546 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.130957 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs podName:1ca3c024-0f0b-4651-8eb7-9a7e0511739c nodeName:}" failed. No retries permitted until 2026-02-01 06:43:24.130939225 +0000 UTC m=+34.781875240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs") pod "network-metrics-daemon-8tdck" (UID: "1ca3c024-0f0b-4651-8eb7-9a7e0511739c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.163638 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.163675 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.163688 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.163702 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.163713 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.265773 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.265805 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.265814 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.265825 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.265834 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.332139 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.332277 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-01 06:43:39.332256892 +0000 UTC m=+49.983192908 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.332374 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.332418 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.332515 4546 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.332527 4546 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.332572 4546 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:39.332559933 +0000 UTC m=+49.983495949 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.332597 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:39.332590761 +0000 UTC m=+49.983526777 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.367897 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.367948 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.367958 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.367971 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.367982 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.433299 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.433337 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.433428 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.433457 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.433477 4546 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.433492 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:23 crc 
kubenswrapper[4546]: E0201 06:43:23.433507 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.433517 4546 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.433527 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:39.433512641 +0000 UTC m=+50.084448657 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.433550 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:43:39.433539872 +0000 UTC m=+50.084475888 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.469850 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.469909 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.469920 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.469936 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.469946 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.572484 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.572520 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.572529 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.572543 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.572553 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.639462 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:06:34.958568642 +0000 UTC Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.654851 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.654941 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.654955 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.655087 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.655178 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:23 crc kubenswrapper[4546]: E0201 06:43:23.655283 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.674923 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.674949 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.674962 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.674973 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.674985 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.777214 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.777243 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.777253 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.777266 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.777276 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.878932 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.879047 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.879123 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.879208 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.879281 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.980879 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.980916 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.980925 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.980941 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:23 crc kubenswrapper[4546]: I0201 06:43:23.980950 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:23Z","lastTransitionTime":"2026-02-01T06:43:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.082788 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.082808 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.082815 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.082824 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.082833 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:24Z","lastTransitionTime":"2026-02-01T06:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.139709 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:24 crc kubenswrapper[4546]: E0201 06:43:24.139839 4546 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:24 crc kubenswrapper[4546]: E0201 06:43:24.139938 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs podName:1ca3c024-0f0b-4651-8eb7-9a7e0511739c nodeName:}" failed. No retries permitted until 2026-02-01 06:43:26.139920423 +0000 UTC m=+36.790856449 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs") pod "network-metrics-daemon-8tdck" (UID: "1ca3c024-0f0b-4651-8eb7-9a7e0511739c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.184572 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.184611 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.184620 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.184633 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.184642 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:24Z","lastTransitionTime":"2026-02-01T06:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.286839 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.286883 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.286892 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.286905 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.286914 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:24Z","lastTransitionTime":"2026-02-01T06:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.388977 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.389009 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.389018 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.389034 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.389044 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:24Z","lastTransitionTime":"2026-02-01T06:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.490782 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.490835 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.490846 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.490878 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.490887 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:24Z","lastTransitionTime":"2026-02-01T06:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.593176 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.593214 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.593224 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.593236 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.593245 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:24Z","lastTransitionTime":"2026-02-01T06:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.639993 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 17:34:50.217190291 +0000 UTC Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.654665 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:24 crc kubenswrapper[4546]: E0201 06:43:24.654762 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.694829 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.694875 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.694886 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.694901 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.694911 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:24Z","lastTransitionTime":"2026-02-01T06:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.796594 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.796733 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.796815 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.796937 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.797022 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:24Z","lastTransitionTime":"2026-02-01T06:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.900002 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.900057 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.900075 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.900098 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:24 crc kubenswrapper[4546]: I0201 06:43:24.900118 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:24Z","lastTransitionTime":"2026-02-01T06:43:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.002745 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.002795 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.002811 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.002832 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.002849 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.105349 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.105375 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.105384 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.105393 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.105401 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.208015 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.208058 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.208070 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.208083 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.208094 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.309997 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.310026 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.310037 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.310051 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.310059 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.411844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.411913 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.411929 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.411948 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.411960 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.513833 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.513899 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.513909 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.513923 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.513932 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.615530 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.615565 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.615574 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.615586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.615630 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.640799 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:00:32.049434583 +0000 UTC Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.654261 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.654313 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:25 crc kubenswrapper[4546]: E0201 06:43:25.654404 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.654432 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:25 crc kubenswrapper[4546]: E0201 06:43:25.654540 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:25 crc kubenswrapper[4546]: E0201 06:43:25.654611 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.717301 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.717330 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.717339 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.717350 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.717358 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.818998 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.819046 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.819058 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.819072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.819085 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.921310 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.921356 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.921368 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.921382 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.921392 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:25Z","lastTransitionTime":"2026-02-01T06:43:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.951293 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.965271 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2da
ed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.974083 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.983511 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\"
,\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:25 crc kubenswrapper[4546]: I0201 06:43:25.992609 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.001698 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.014738 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.022892 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.022932 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.022946 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.022969 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.022983 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.024969 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.035601 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.045249 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.053046 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.066551 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.076779 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.084823 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc 
kubenswrapper[4546]: I0201 06:43:26.094592 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.103842 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.113280 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.124989 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.125031 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.125042 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.125059 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.125070 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.167672 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:26 crc kubenswrapper[4546]: E0201 06:43:26.167803 4546 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:26 crc kubenswrapper[4546]: E0201 06:43:26.167907 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs podName:1ca3c024-0f0b-4651-8eb7-9a7e0511739c nodeName:}" failed. No retries permitted until 2026-02-01 06:43:30.167887685 +0000 UTC m=+40.818823711 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs") pod "network-metrics-daemon-8tdck" (UID: "1ca3c024-0f0b-4651-8eb7-9a7e0511739c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.226496 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.226530 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.226539 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.226549 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.226558 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.328706 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.328758 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.328771 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.328796 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.328809 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.430600 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.430628 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.430637 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.430649 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.430658 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.532543 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.532570 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.532578 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.532588 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.532595 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.634955 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.634994 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.635006 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.635021 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.635033 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.641360 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 09:18:04.956261952 +0000 UTC Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.654800 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:26 crc kubenswrapper[4546]: E0201 06:43:26.654916 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.737081 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.737142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.737159 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.737179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.737195 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.839393 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.839426 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.839442 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.839455 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.839465 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.941442 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.941489 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.941499 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.941513 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:26 crc kubenswrapper[4546]: I0201 06:43:26.941524 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:26Z","lastTransitionTime":"2026-02-01T06:43:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.043331 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.043381 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.043405 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.043418 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.043427 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.145248 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.145303 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.145317 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.145349 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.145362 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.247812 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.247843 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.247851 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.247887 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.247897 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.349827 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.349878 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.349889 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.349900 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.349908 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.451979 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.452028 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.452045 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.452062 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.452082 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.554115 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.554137 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.554144 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.554154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.554164 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.642028 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 07:42:49.157308992 +0000 UTC Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.654356 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.654455 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.654356 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:27 crc kubenswrapper[4546]: E0201 06:43:27.654641 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:27 crc kubenswrapper[4546]: E0201 06:43:27.654745 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:27 crc kubenswrapper[4546]: E0201 06:43:27.654803 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.655397 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.655421 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.655437 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.655449 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.655457 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.758005 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.758034 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.758046 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.758059 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.758068 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.859955 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.859981 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.859990 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.860024 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.860033 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.962306 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.962337 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.962345 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.962361 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:27 crc kubenswrapper[4546]: I0201 06:43:27.962369 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:27Z","lastTransitionTime":"2026-02-01T06:43:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.064048 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.064089 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.064098 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.064112 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.064122 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.166053 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.166089 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.166122 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.166133 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.166140 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.268266 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.268305 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.268315 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.268329 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.268338 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.370274 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.370304 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.370315 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.370325 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.370333 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.472409 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.472435 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.472446 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.472455 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.472466 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.574838 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.574885 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.574894 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.574904 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.574913 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.643069 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 13:24:14.164302168 +0000 UTC Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.654388 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:28 crc kubenswrapper[4546]: E0201 06:43:28.654496 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.677338 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.677391 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.677404 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.677430 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.677447 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.779383 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.779653 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.779694 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.779706 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.779723 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.779734 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.780410 4546 scope.go:117] "RemoveContainer" containerID="7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.882179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.882221 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.882231 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.882246 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.882257 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.907900 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/1.log" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.919276 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.919904 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.930380 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.938758 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.947985 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.955460 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.968098 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81
aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f4
04805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.978627 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.984076 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.984108 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.984117 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.984129 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.984140 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:28Z","lastTransitionTime":"2026-02-01T06:43:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:28 crc kubenswrapper[4546]: I0201 06:43:28.988312 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f515e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:28Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.002047 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 
genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0
5aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.012589 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.024221 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.032163 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.046764 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.059529 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.083542 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 
2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.086530 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.086554 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.086563 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.086578 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.086588 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:29Z","lastTransitionTime":"2026-02-01T06:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.095779 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.118611 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc 
kubenswrapper[4546]: I0201 06:43:29.189269 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.189315 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.189328 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.189345 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.189359 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:29Z","lastTransitionTime":"2026-02-01T06:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.292228 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.292287 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.292300 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.292363 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.292388 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:29Z","lastTransitionTime":"2026-02-01T06:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.394763 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.394827 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.394844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.394885 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.394900 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:29Z","lastTransitionTime":"2026-02-01T06:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.503664 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.503706 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.503717 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.503733 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.503747 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:29Z","lastTransitionTime":"2026-02-01T06:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.605999 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.606035 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.606046 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.606057 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.606066 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:29Z","lastTransitionTime":"2026-02-01T06:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.643179 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 22:52:43.803478988 +0000 UTC Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.654501 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:29 crc kubenswrapper[4546]: E0201 06:43:29.654624 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.654874 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.654993 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:29 crc kubenswrapper[4546]: E0201 06:43:29.655083 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:29 crc kubenswrapper[4546]: E0201 06:43:29.655259 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.669913 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.681834 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\
\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.692684 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3d
e47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.701995 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.707458 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.707499 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.707510 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:29 crc 
kubenswrapper[4546]: I0201 06:43:29.707528 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.707539 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:29Z","lastTransitionTime":"2026-02-01T06:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.716658 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.730876 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 
2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.741799 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.750755 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.760389 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.769832 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.779115 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.786827 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.799529 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81
aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f4
04805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.809317 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.809364 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.809386 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.809414 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.809425 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:29Z","lastTransitionTime":"2026-02-01T06:43:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.812242 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.822825 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.833837 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\"
,\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.911614 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.911642 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.911652 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.911680 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.911725 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:29Z","lastTransitionTime":"2026-02-01T06:43:29Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.924408 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/2.log" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.925009 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/1.log" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.927447 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe" exitCode=1 Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.927513 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe"} Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.927552 4546 scope.go:117] "RemoveContainer" containerID="7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.928237 4546 scope.go:117] "RemoveContainer" containerID="b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe" Feb 01 06:43:29 crc kubenswrapper[4546]: E0201 06:43:29.928409 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" 
podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.945692 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni
/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.960054 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4f929350b1a0d736a29e6bab4dda03b88d27bb336a21305de964f2e84f2a36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:18Z\\\",\\\"message\\\":\\\"8.555827 5874 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0201 06:43:18.556010 5874 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0201 06:43:18.556071 5874 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:18Z is after 2025-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:29Z\\\",\\\"message\\\":\\\"}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0201 06:43:29.458061 6060 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0201 06:43:29.458061 6060 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:43:29.458073 6060 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0201 06:43:29.458087 6060 services_controller.go:451] Built service openshift-ingress-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\
"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.969197 
4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.978119 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.986565 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:29 crc kubenswrapper[4546]: I0201 06:43:29.993706 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.001043 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.009579 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc 
kubenswrapper[4546]: I0201 06:43:30.013296 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.013333 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.013346 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.013360 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.013368 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.017853 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.025537 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.033257 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.044775 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.052584 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.062549 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\"
,\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.071252 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.079821 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.115648 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.115688 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.115716 4546 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.115748 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.115763 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.210328 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:30 crc kubenswrapper[4546]: E0201 06:43:30.210435 4546 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:30 crc kubenswrapper[4546]: E0201 06:43:30.210504 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs podName:1ca3c024-0f0b-4651-8eb7-9a7e0511739c nodeName:}" failed. No retries permitted until 2026-02-01 06:43:38.210488809 +0000 UTC m=+48.861424825 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs") pod "network-metrics-daemon-8tdck" (UID: "1ca3c024-0f0b-4651-8eb7-9a7e0511739c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.217397 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.217424 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.217434 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.217447 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.217457 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.319602 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.319629 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.319639 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.319652 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.319662 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.421826 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.421892 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.421903 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.421914 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.421923 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.523586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.523653 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.523666 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.523679 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.523689 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.626003 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.626036 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.626046 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.626057 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.626066 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.643455 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 00:23:31.965276609 +0000 UTC Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.654562 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:30 crc kubenswrapper[4546]: E0201 06:43:30.654735 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.727838 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.727894 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.727907 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.727928 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.727939 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.829903 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.829929 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.829939 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.829951 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.829961 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.931602 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.931653 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.931664 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.931687 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.931729 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/2.log" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.931745 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.935839 4546 scope.go:117] "RemoveContainer" containerID="b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe" Feb 01 06:43:30 crc kubenswrapper[4546]: E0201 06:43:30.936061 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.945693 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f515e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime
\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.949129 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.949164 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.949176 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.949192 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.949253 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: E0201 06:43:30.958867 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.959608 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea
6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 
UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\
\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.962148 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.962198 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.962211 4546 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.962228 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.962240 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.971217 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 
2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: E0201 06:43:30.974934 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.978911 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.978945 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.978960 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.978981 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.978995 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.982849 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: E0201 06:43:30.988308 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.990689 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.991466 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.991525 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.991536 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.991577 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:30 crc kubenswrapper[4546]: I0201 06:43:30.991593 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:30Z","lastTransitionTime":"2026-02-01T06:43:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.004430 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: E0201 06:43:31.005657 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.009341 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.009463 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.009546 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.009627 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.009685 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.014556 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z 
is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: E0201 06:43:31.019149 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: E0201 06:43:31.019403 4546 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.029576 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:29Z\\\",\\\"message\\\":\\\"}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0201 06:43:29.458061 6060 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0201 06:43:29.458061 6060 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:43:29.458073 6060 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0201 06:43:29.458087 6060 services_controller.go:451] Built service openshift-ingress-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.033740 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.033779 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.033790 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.033822 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.033832 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.039225 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.048356 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.057285 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc 
kubenswrapper[4546]: I0201 06:43:31.066642 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.074958 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.083358 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.093936 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.101350 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:31Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.136168 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.136232 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.136246 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.136267 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.136296 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.237899 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.237928 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.237938 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.237955 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.237966 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.340152 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.340192 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.340203 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.340217 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.340226 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.442508 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.442545 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.442556 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.442571 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.442581 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.544977 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.545019 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.545033 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.545084 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.545098 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.644183 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 04:50:25.417348936 +0000 UTC Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.647571 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.647610 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.647622 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.647636 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.647646 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.655039 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.655043 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.655181 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:31 crc kubenswrapper[4546]: E0201 06:43:31.655334 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:31 crc kubenswrapper[4546]: E0201 06:43:31.655450 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:31 crc kubenswrapper[4546]: E0201 06:43:31.655582 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.749683 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.749732 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.749745 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.749757 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.749769 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.851954 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.851982 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.851995 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.852011 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.852021 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.954210 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.954293 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.954307 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.954329 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:31 crc kubenswrapper[4546]: I0201 06:43:31.954342 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:31Z","lastTransitionTime":"2026-02-01T06:43:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.056388 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.056506 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.056602 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.056700 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.056778 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.158851 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.159034 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.159108 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.159170 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.159423 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.262204 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.262358 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.262440 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.262505 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.262572 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.364738 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.364788 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.364803 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.364826 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.364837 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.466934 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.467067 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.467134 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.467193 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.467256 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.569397 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.569468 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.569507 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.569535 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.569552 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.644545 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 17:59:35.100115207 +0000 UTC Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.654887 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:32 crc kubenswrapper[4546]: E0201 06:43:32.655043 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.671638 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.671756 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.671882 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.671989 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.672047 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.774824 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.774880 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.774891 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.774909 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.774918 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.876755 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.876921 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.877002 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.877072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.877136 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.979054 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.979080 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.979089 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.979100 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:32 crc kubenswrapper[4546]: I0201 06:43:32.979107 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:32Z","lastTransitionTime":"2026-02-01T06:43:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.080936 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.080977 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.080987 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.081007 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.081018 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:33Z","lastTransitionTime":"2026-02-01T06:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.183269 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.183303 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.183313 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.183324 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.183332 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:33Z","lastTransitionTime":"2026-02-01T06:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.285256 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.285282 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.285292 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.285306 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.285318 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:33Z","lastTransitionTime":"2026-02-01T06:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.387043 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.387179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.387261 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.387320 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.387367 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:33Z","lastTransitionTime":"2026-02-01T06:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.489165 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.489315 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.489396 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.489457 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.489521 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:33Z","lastTransitionTime":"2026-02-01T06:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.592043 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.592081 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.592094 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.592108 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.592121 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:33Z","lastTransitionTime":"2026-02-01T06:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.644782 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 10:20:35.247954471 +0000 UTC Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.654061 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.654070 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:33 crc kubenswrapper[4546]: E0201 06:43:33.654154 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:33 crc kubenswrapper[4546]: E0201 06:43:33.654240 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.654331 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:33 crc kubenswrapper[4546]: E0201 06:43:33.654385 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.693752 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.693795 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.693809 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.693827 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.693842 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:33Z","lastTransitionTime":"2026-02-01T06:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.795804 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.795840 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.795852 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.795884 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.795899 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:33Z","lastTransitionTime":"2026-02-01T06:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.897812 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.897871 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.897884 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.897903 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:33 crc kubenswrapper[4546]: I0201 06:43:33.897916 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:33Z","lastTransitionTime":"2026-02-01T06:43:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:33.999967 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.000252 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.000345 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.000410 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.000463 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.102634 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.102670 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.102682 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.102694 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.102704 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.204819 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.204871 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.204883 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.204897 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.204910 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.306832 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.306883 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.306895 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.306910 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.306920 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.408844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.409005 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.409062 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.409129 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.409189 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.510715 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.510757 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.510768 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.510778 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.510789 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.612989 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.613380 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.613460 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.613551 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.613630 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.645523 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:30:54.871038865 +0000 UTC Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.654232 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:34 crc kubenswrapper[4546]: E0201 06:43:34.654404 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.715961 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.715995 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.716007 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.716037 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.716049 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.818375 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.818412 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.818422 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.818435 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.818445 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.920540 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.920575 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.920584 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.920598 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:34 crc kubenswrapper[4546]: I0201 06:43:34.920609 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:34Z","lastTransitionTime":"2026-02-01T06:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.023265 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.023302 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.023313 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.023328 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.023341 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.124987 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.125030 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.125038 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.125059 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.125071 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.227076 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.227114 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.227124 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.227136 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.227145 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.328762 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.328791 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.328801 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.328832 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.328842 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.435944 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.436525 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.436543 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.436563 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.436573 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.538604 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.538646 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.538656 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.538671 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.538682 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.640136 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.640167 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.640176 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.640189 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.640199 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.646568 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 17:07:20.159782302 +0000 UTC Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.653992 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.654104 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.654052 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:35 crc kubenswrapper[4546]: E0201 06:43:35.654280 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:35 crc kubenswrapper[4546]: E0201 06:43:35.654380 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:35 crc kubenswrapper[4546]: E0201 06:43:35.655263 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.742106 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.742145 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.742157 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.742173 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.742185 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.843604 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.843661 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.843672 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.843685 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.843695 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.945779 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.945825 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.945836 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.945869 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:35 crc kubenswrapper[4546]: I0201 06:43:35.945890 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:35Z","lastTransitionTime":"2026-02-01T06:43:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.048403 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.048437 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.048447 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.048461 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.048470 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.150983 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.151013 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.151022 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.151034 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.151045 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.253197 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.253240 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.253254 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.253273 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.253286 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.354980 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.355020 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.355031 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.355046 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.355056 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.457042 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.457076 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.457090 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.457104 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.457113 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.559654 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.559687 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.559696 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.559709 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.559719 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.647012 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 20:15:23.62512191 +0000 UTC Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.654797 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:36 crc kubenswrapper[4546]: E0201 06:43:36.654934 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.655823 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.661068 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.661104 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.661112 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.661123 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.661134 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.664669 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.667021 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.676040 4546 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.688164 4546 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:29Z\\\",\\\"message\\\":\\\"}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0201 06:43:29.458061 6060 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0201 06:43:29.458061 6060 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:43:29.458073 6060 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0201 06:43:29.458087 6060 services_controller.go:451] Built service openshift-ingress-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.697386 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.707982 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.715689 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.723078 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.730973 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc 
kubenswrapper[4546]: I0201 06:43:36.739781 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.749907 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.758521 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.762975 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.763012 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.763022 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.763035 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.763043 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.769071 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.776117 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.786462 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] 
validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.795749 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 
06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.803828 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f515e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:36Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.865840 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.865897 
4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.865910 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.865925 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.865935 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.967257 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.967284 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.967292 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.967302 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:36 crc kubenswrapper[4546]: I0201 06:43:36.967313 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:36Z","lastTransitionTime":"2026-02-01T06:43:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.070055 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.070095 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.070110 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.070126 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.070139 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.172033 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.172072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.172091 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.172107 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.172117 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.274279 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.274308 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.274316 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.274327 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.274334 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.376118 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.376149 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.376159 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.376171 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.376185 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.478184 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.478231 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.478244 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.478266 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.478279 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.579650 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.579684 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.579717 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.579734 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.579745 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.647301 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 00:14:18.147244951 +0000 UTC Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.654809 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.654836 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.654821 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:37 crc kubenswrapper[4546]: E0201 06:43:37.654962 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:37 crc kubenswrapper[4546]: E0201 06:43:37.655041 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:37 crc kubenswrapper[4546]: E0201 06:43:37.655351 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.681650 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.681695 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.681705 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.681726 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.681739 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.783895 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.783927 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.783938 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.783969 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.783979 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.885825 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.885849 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.885975 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.885997 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.886006 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.987771 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.987794 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.987803 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.987814 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:37 crc kubenswrapper[4546]: I0201 06:43:37.987837 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:37Z","lastTransitionTime":"2026-02-01T06:43:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.089078 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.089122 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.089136 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.089150 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.089161 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:38Z","lastTransitionTime":"2026-02-01T06:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.191286 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.191330 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.191339 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.191353 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.191367 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:38Z","lastTransitionTime":"2026-02-01T06:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.278437 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:38 crc kubenswrapper[4546]: E0201 06:43:38.278601 4546 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:38 crc kubenswrapper[4546]: E0201 06:43:38.278662 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs podName:1ca3c024-0f0b-4651-8eb7-9a7e0511739c nodeName:}" failed. No retries permitted until 2026-02-01 06:43:54.278644031 +0000 UTC m=+64.929580047 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs") pod "network-metrics-daemon-8tdck" (UID: "1ca3c024-0f0b-4651-8eb7-9a7e0511739c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.292759 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.292783 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.292792 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.292804 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.292811 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:38Z","lastTransitionTime":"2026-02-01T06:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.394450 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.394490 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.394517 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.394533 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.394543 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:38Z","lastTransitionTime":"2026-02-01T06:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.497238 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.497263 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.497276 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.497287 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.497295 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:38Z","lastTransitionTime":"2026-02-01T06:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.599664 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.599711 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.599721 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.599741 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.599771 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:38Z","lastTransitionTime":"2026-02-01T06:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.648257 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 20:05:12.431261726 +0000 UTC Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.654564 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:38 crc kubenswrapper[4546]: E0201 06:43:38.655139 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.701809 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.701845 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.701873 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.701888 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.701899 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:38Z","lastTransitionTime":"2026-02-01T06:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.803925 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.803986 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.803999 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.804014 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.804023 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:38Z","lastTransitionTime":"2026-02-01T06:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.905685 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.905717 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.905727 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.905754 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:38 crc kubenswrapper[4546]: I0201 06:43:38.905763 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:38Z","lastTransitionTime":"2026-02-01T06:43:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.007254 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.007301 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.007313 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.007335 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.007349 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.109565 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.109602 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.109611 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.109624 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.109634 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.212090 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.212259 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.212363 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.212451 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.212576 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.314771 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.314894 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.314971 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.315070 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.315325 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.389234 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.389486 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-01 06:44:11.389466502 +0000 UTC m=+82.040402519 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.389742 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.389951 4546 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.390145 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:11.390135634 +0000 UTC m=+82.041071650 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.390245 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.390408 4546 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.390525 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:11.39048908 +0000 UTC m=+82.041425096 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.417471 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.417518 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.417529 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.417553 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.417567 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.490773 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.490817 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.490991 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.491016 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.491029 4546 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.491061 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 
06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.491099 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.491119 4546 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.491073 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:11.491059898 +0000 UTC m=+82.141995914 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.491209 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:11.491186506 +0000 UTC m=+82.142122532 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.519145 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.519213 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.519226 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.519254 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.519268 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.621385 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.621430 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.621442 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.621460 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.621471 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.648702 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 22:23:37.570294712 +0000 UTC Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.654101 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.654239 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.654479 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.654530 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.654769 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:39 crc kubenswrapper[4546]: E0201 06:43:39.654941 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.669950 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.681046 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.693423 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\"
,\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.703192 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.715755 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.723664 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.723707 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.723718 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.723735 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.723746 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.725600 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.733190 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.741624 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.750252 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.764115 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:29Z\\\",\\\"message\\\":\\\"}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0201 06:43:29.458061 6060 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0201 06:43:29.458061 6060 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:43:29.458073 6060 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0201 06:43:29.458087 6060 services_controller.go:451] Built service openshift-ingress-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.773580 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.781491 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.789807 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.798916 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.807319 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.813814 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.825677 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.825779 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.825840 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.825939 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.826002 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.826436 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.927965 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.928008 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.928019 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.928039 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:39 crc kubenswrapper[4546]: I0201 06:43:39.928056 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:39Z","lastTransitionTime":"2026-02-01T06:43:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.030619 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.030660 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.030671 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.030689 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.030705 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.132763 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.132885 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.132944 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.133002 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.133062 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.235881 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.235926 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.235942 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.235964 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.235977 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.337665 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.337698 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.337708 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.337724 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.337734 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.439648 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.439677 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.439685 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.439694 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.439701 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.541928 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.541972 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.541987 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.541999 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.542009 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.644205 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.644253 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.644263 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.644283 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.644295 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.649479 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 12:17:57.842719423 +0000 UTC Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.654735 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:40 crc kubenswrapper[4546]: E0201 06:43:40.654836 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.746106 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.746132 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.746142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.746155 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.746164 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.849019 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.849038 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.849048 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.849061 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.849069 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.950460 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.950600 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.950687 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.950754 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:40 crc kubenswrapper[4546]: I0201 06:43:40.950817 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:40Z","lastTransitionTime":"2026-02-01T06:43:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.052722 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.052752 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.052763 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.052778 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.052789 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.154159 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.154182 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.154188 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.154197 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.154204 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.256193 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.256221 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.256229 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.256258 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.256267 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.312696 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.312755 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.312766 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.312787 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.312801 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: E0201 06:43:41.324507 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:41Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.328525 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.328571 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.328581 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.328592 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.328599 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: E0201 06:43:41.337946 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:41Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.340562 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.340603 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.340614 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.340631 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.340641 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: E0201 06:43:41.349887 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:41Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.352596 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.352686 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.352739 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.352800 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.352870 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: E0201 06:43:41.361653 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:41Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.364212 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.364252 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.364264 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.364273 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.364280 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: E0201 06:43:41.372169 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:41Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:41 crc kubenswrapper[4546]: E0201 06:43:41.372318 4546 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.373219 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.373255 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.373262 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.373272 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.373279 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.475426 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.475567 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.475636 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.475699 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.475754 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.577698 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.577818 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.577906 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.577985 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.578048 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.649813 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 09:11:04.405042629 +0000 UTC Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.654163 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.654207 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.654192 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:41 crc kubenswrapper[4546]: E0201 06:43:41.654340 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:41 crc kubenswrapper[4546]: E0201 06:43:41.654426 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:41 crc kubenswrapper[4546]: E0201 06:43:41.654542 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.680444 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.680468 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.680479 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.680490 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.680506 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.782953 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.782989 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.782999 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.783015 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.783029 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.884470 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.884509 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.884518 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.884532 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.884545 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.986076 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.986179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.986240 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.986313 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:41 crc kubenswrapper[4546]: I0201 06:43:41.986370 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:41Z","lastTransitionTime":"2026-02-01T06:43:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.088072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.088099 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.088108 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.088118 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.088127 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:42Z","lastTransitionTime":"2026-02-01T06:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.189887 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.189941 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.189951 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.189964 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.189974 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:42Z","lastTransitionTime":"2026-02-01T06:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.291587 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.291613 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.291620 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.291629 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.291638 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:42Z","lastTransitionTime":"2026-02-01T06:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.393783 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.393810 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.393819 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.393830 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.393837 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:42Z","lastTransitionTime":"2026-02-01T06:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.495768 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.495840 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.495895 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.495921 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.495934 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:42Z","lastTransitionTime":"2026-02-01T06:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.598271 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.598317 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.598327 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.598338 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.598346 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:42Z","lastTransitionTime":"2026-02-01T06:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.650303 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 23:46:20.406133503 +0000 UTC Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.654664 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:42 crc kubenswrapper[4546]: E0201 06:43:42.654854 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.700441 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.700465 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.700476 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.700487 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.700497 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:42Z","lastTransitionTime":"2026-02-01T06:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.801968 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.802032 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.802043 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.802061 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.802076 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:42Z","lastTransitionTime":"2026-02-01T06:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.903811 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.903847 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.903875 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.903893 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:42 crc kubenswrapper[4546]: I0201 06:43:42.903902 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:42Z","lastTransitionTime":"2026-02-01T06:43:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.005914 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.005966 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.005978 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.005991 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.006003 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.108112 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.108144 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.108173 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.108186 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.108194 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.209892 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.209922 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.209933 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.209944 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.209953 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.314023 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.314055 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.314065 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.314079 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.314089 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.415485 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.415521 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.415530 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.415542 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.415556 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.517510 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.517535 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.517547 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.517558 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.517566 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.619145 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.619172 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.619181 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.619191 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.619199 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.650606 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 19:57:14.711634921 +0000 UTC Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.653832 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.653939 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:43 crc kubenswrapper[4546]: E0201 06:43:43.654003 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.653833 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:43 crc kubenswrapper[4546]: E0201 06:43:43.654058 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:43 crc kubenswrapper[4546]: E0201 06:43:43.653934 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.721104 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.721142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.721150 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.721159 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.721166 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.823154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.823190 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.823197 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.823208 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.823219 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.927691 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.927735 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.927745 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.927761 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:43 crc kubenswrapper[4546]: I0201 06:43:43.927769 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:43Z","lastTransitionTime":"2026-02-01T06:43:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.030016 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.030076 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.030088 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.030110 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.030125 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.131884 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.131999 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.132063 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.132115 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.132168 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.234259 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.234280 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.234288 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.234297 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.234304 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.335827 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.335923 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.335994 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.336050 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.336095 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.437212 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.437236 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.437272 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.437280 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.437287 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.538572 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.538678 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.538744 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.538798 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.538874 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.640254 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.640345 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.640424 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.640481 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.640547 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.651704 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 18:14:09.716117811 +0000 UTC Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.653947 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:44 crc kubenswrapper[4546]: E0201 06:43:44.654102 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.742815 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.742962 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.743015 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.743078 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.743129 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.844896 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.844981 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.845037 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.845101 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.845160 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.947445 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.947537 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.947591 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.947642 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:44 crc kubenswrapper[4546]: I0201 06:43:44.947702 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:44Z","lastTransitionTime":"2026-02-01T06:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.049252 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.049272 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.049280 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.049289 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.049296 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.150836 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.150868 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.150877 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.150886 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.150892 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.252760 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.252781 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.252790 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.252799 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.252807 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.354283 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.354389 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.354441 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.354490 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.354548 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.455997 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.456093 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.456160 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.456213 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.456260 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.558088 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.558132 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.558142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.558159 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.558171 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.652071 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 11:57:39.47315937 +0000 UTC Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.654316 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.654330 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:45 crc kubenswrapper[4546]: E0201 06:43:45.654408 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.654435 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:45 crc kubenswrapper[4546]: E0201 06:43:45.654484 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:45 crc kubenswrapper[4546]: E0201 06:43:45.654540 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.654970 4546 scope.go:117] "RemoveContainer" containerID="b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe" Feb 01 06:43:45 crc kubenswrapper[4546]: E0201 06:43:45.655156 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.659077 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.659099 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.659114 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.659126 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.659134 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.760723 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.760749 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.760758 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.760767 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.760774 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.862097 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.862124 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.862134 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.862146 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.862153 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.963549 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.963575 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.963585 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.963595 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:45 crc kubenswrapper[4546]: I0201 06:43:45.963602 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:45Z","lastTransitionTime":"2026-02-01T06:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.064462 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.064489 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.064498 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.064520 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.064529 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.166604 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.166653 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.166666 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.166686 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.166697 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.269188 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.269219 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.269227 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.269241 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.269251 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.370952 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.370983 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.370991 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.371002 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.371010 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.473060 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.473090 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.473099 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.473112 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.473125 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.575032 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.575315 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.575324 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.575334 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.575342 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.652135 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 08:14:23.075029021 +0000 UTC Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.654904 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:46 crc kubenswrapper[4546]: E0201 06:43:46.655126 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.677091 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.677120 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.677130 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.677142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.677149 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.778841 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.778896 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.778907 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.778918 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.778927 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.880746 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.880782 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.880792 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.880805 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.880813 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.983164 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.983207 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.983216 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.983233 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:46 crc kubenswrapper[4546]: I0201 06:43:46.983243 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:46Z","lastTransitionTime":"2026-02-01T06:43:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.085903 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.085959 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.085970 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.085993 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.086022 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:47Z","lastTransitionTime":"2026-02-01T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.188410 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.188467 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.188477 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.188500 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.188527 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:47Z","lastTransitionTime":"2026-02-01T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.291075 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.291124 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.291140 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.291160 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.291174 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:47Z","lastTransitionTime":"2026-02-01T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.393124 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.393167 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.393179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.393194 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.393204 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:47Z","lastTransitionTime":"2026-02-01T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.495305 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.495340 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.495354 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.495378 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.495392 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:47Z","lastTransitionTime":"2026-02-01T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.598067 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.598098 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.598107 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.598119 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.598135 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:47Z","lastTransitionTime":"2026-02-01T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.653174 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 08:42:40.820259157 +0000 UTC Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.654448 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:47 crc kubenswrapper[4546]: E0201 06:43:47.654592 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.654620 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:47 crc kubenswrapper[4546]: E0201 06:43:47.654686 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.654608 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:47 crc kubenswrapper[4546]: E0201 06:43:47.654891 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.700294 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.700325 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.700333 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.700345 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.700354 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:47Z","lastTransitionTime":"2026-02-01T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.802651 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.802703 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.802716 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.802733 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.802745 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:47Z","lastTransitionTime":"2026-02-01T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.904835 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.904890 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.904904 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.904919 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:47 crc kubenswrapper[4546]: I0201 06:43:47.904931 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:47Z","lastTransitionTime":"2026-02-01T06:43:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.006825 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.006903 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.006918 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.006939 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.006954 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.109278 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.109424 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.109493 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.109594 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.109668 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.211931 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.211968 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.211977 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.211992 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.212006 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.314081 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.314120 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.314131 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.314145 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.314157 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.416373 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.416543 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.416607 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.416682 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.416750 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.525664 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.525776 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.525919 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.525980 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.526301 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.629116 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.629145 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.629154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.629166 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.629174 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.653742 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:16:44.289191508 +0000 UTC Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.653885 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:48 crc kubenswrapper[4546]: E0201 06:43:48.654048 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.731887 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.731922 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.731936 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.731954 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.731967 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.834033 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.834076 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.834092 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.834113 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.834126 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.935974 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.936008 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.936017 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.936030 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:48 crc kubenswrapper[4546]: I0201 06:43:48.936041 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:48Z","lastTransitionTime":"2026-02-01T06:43:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.038299 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.038322 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.038334 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.038347 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.038354 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.140657 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.140685 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.140695 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.140706 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.140715 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.242452 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.242479 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.242489 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.242500 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.242506 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.344331 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.344492 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.344582 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.344654 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.344730 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.446681 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.446819 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.446932 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.446998 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.447056 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.548890 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.548929 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.548941 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.548955 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.548965 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.651075 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.651224 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.651290 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.651359 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.651503 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.653837 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 22:57:25.849238635 +0000 UTC Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.653961 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.654255 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.654540 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:49 crc kubenswrapper[4546]: E0201 06:43:49.654679 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:49 crc kubenswrapper[4546]: E0201 06:43:49.654886 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:49 crc kubenswrapper[4546]: E0201 06:43:49.654940 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.670101 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.681030 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.689157 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.700675 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.708917 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.719185 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\"
,\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.727884 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.736353 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.743654 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.750431 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.752802 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.752829 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.752837 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.752868 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.752880 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.757609 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.765445 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.777257 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:29Z\\\",\\\"message\\\":\\\"}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0201 06:43:29.458061 6060 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0201 06:43:29.458061 6060 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:43:29.458073 6060 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0201 06:43:29.458087 6060 services_controller.go:451] Built service openshift-ingress-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.785807 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.793987 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.800915 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.807327 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:49Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:49 crc 
kubenswrapper[4546]: I0201 06:43:49.854789 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.854819 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.854831 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.854847 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.854881 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.956218 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.956268 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.956278 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.956292 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:49 crc kubenswrapper[4546]: I0201 06:43:49.956301 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:49Z","lastTransitionTime":"2026-02-01T06:43:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.058127 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.058154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.058162 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.058175 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.058184 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.160505 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.160553 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.160565 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.160579 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.160590 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.262789 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.262830 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.262840 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.262869 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.262880 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.364780 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.364922 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.365007 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.365072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.365149 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.466890 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.466923 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.466932 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.466946 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.466958 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.568482 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.568543 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.568563 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.568586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.568600 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.654301 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 09:14:42.247308701 +0000 UTC Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.654461 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:50 crc kubenswrapper[4546]: E0201 06:43:50.654590 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.670542 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.670577 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.670586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.670598 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.670609 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.773935 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.774167 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.774244 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.774312 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.774371 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.876797 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.876823 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.876832 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.876844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.876869 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.978909 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.979114 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.979179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.979248 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:50 crc kubenswrapper[4546]: I0201 06:43:50.979315 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:50Z","lastTransitionTime":"2026-02-01T06:43:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.080767 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.080794 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.080803 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.080812 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.080820 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.183068 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.183220 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.183292 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.183359 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.183415 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.285890 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.285930 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.285939 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.285951 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.285960 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.388585 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.388622 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.388632 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.388649 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.388660 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.490257 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.490506 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.490582 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.490647 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.490707 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.592208 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.592236 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.592244 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.592254 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.592261 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.619672 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.619781 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.619851 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.619935 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.619993 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: E0201 06:43:51.630841 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.633488 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.633648 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.633743 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.633815 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.633903 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: E0201 06:43:51.642176 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.644910 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.644929 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.644938 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.644950 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.644957 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: E0201 06:43:51.653420 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.654549 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 08:52:32.069888907 +0000 UTC Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.654655 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.654672 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.654703 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:51 crc kubenswrapper[4546]: E0201 06:43:51.654744 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:51 crc kubenswrapper[4546]: E0201 06:43:51.654811 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:51 crc kubenswrapper[4546]: E0201 06:43:51.654892 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.655961 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.655989 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.655998 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.656011 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.656019 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: E0201 06:43:51.665748 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.668167 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.668190 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.668198 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.668208 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.668217 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: E0201 06:43:51.675834 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:51 crc kubenswrapper[4546]: E0201 06:43:51.675959 4546 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.693778 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.693803 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.693812 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.693820 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.693829 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.795270 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.795292 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.795300 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.795308 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.795315 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.896396 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.896423 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.896433 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.896443 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.896452 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.998449 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.998478 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.998487 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.998503 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:51 crc kubenswrapper[4546]: I0201 06:43:51.998514 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:51Z","lastTransitionTime":"2026-02-01T06:43:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.103219 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.103245 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.103261 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.103272 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.103281 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:52Z","lastTransitionTime":"2026-02-01T06:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.205064 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.205102 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.205113 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.205126 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.205137 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:52Z","lastTransitionTime":"2026-02-01T06:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.307060 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.307100 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.307110 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.307133 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.307144 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:52Z","lastTransitionTime":"2026-02-01T06:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.408850 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.408887 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.408895 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.408904 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.408911 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:52Z","lastTransitionTime":"2026-02-01T06:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.511104 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.511137 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.511153 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.511165 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.511173 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:52Z","lastTransitionTime":"2026-02-01T06:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.613293 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.613406 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.613477 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.613559 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.613618 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:52Z","lastTransitionTime":"2026-02-01T06:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.654608 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 15:45:31.249039619 +0000 UTC Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.654649 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:52 crc kubenswrapper[4546]: E0201 06:43:52.654737 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.715316 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.715450 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.715533 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.715605 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.715662 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:52Z","lastTransitionTime":"2026-02-01T06:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.817112 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.817329 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.817393 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.817478 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.817547 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:52Z","lastTransitionTime":"2026-02-01T06:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.920001 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.920033 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.920042 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.920054 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:52 crc kubenswrapper[4546]: I0201 06:43:52.920065 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:52Z","lastTransitionTime":"2026-02-01T06:43:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.021295 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.021449 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.021510 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.021594 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.021652 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.123845 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.123980 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.124057 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.124128 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.124184 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.228584 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.228615 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.228623 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.228633 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.228641 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.330504 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.330548 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.330557 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.330571 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.330581 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.432262 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.432288 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.432298 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.432308 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.432317 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.533822 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.533848 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.533870 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.533888 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.533895 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.635783 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.636128 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.636143 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.636198 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.636211 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.654301 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.654347 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:53 crc kubenswrapper[4546]: E0201 06:43:53.654387 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.654398 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:53 crc kubenswrapper[4546]: E0201 06:43:53.654432 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:53 crc kubenswrapper[4546]: E0201 06:43:53.654481 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.654787 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 22:02:29.302721996 +0000 UTC Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.737580 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.737618 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.737630 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.737642 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.737653 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.839482 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.839512 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.839519 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.839539 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.839549 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.941226 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.941266 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.941278 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.941293 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:53 crc kubenswrapper[4546]: I0201 06:43:53.941304 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:53Z","lastTransitionTime":"2026-02-01T06:43:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.042507 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.042543 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.042551 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.042562 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.042570 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.144271 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.144298 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.144306 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.144317 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.144327 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.245684 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.245715 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.245728 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.245738 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.245747 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.317221 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:54 crc kubenswrapper[4546]: E0201 06:43:54.317324 4546 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:54 crc kubenswrapper[4546]: E0201 06:43:54.317366 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs podName:1ca3c024-0f0b-4651-8eb7-9a7e0511739c nodeName:}" failed. No retries permitted until 2026-02-01 06:44:26.317353669 +0000 UTC m=+96.968289685 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs") pod "network-metrics-daemon-8tdck" (UID: "1ca3c024-0f0b-4651-8eb7-9a7e0511739c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.347687 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.347717 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.347726 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.347738 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.347745 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.449776 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.449814 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.449824 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.449836 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.449844 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.551626 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.551677 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.551687 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.551699 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.551708 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.653784 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:54 crc kubenswrapper[4546]: E0201 06:43:54.653908 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.654244 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.654262 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.654269 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.654297 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.654305 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.655341 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 08:09:21.386326951 +0000 UTC Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.756469 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.756544 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.756557 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.756567 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.756575 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.858055 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.858089 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.858097 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.858108 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.858116 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.959896 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.959922 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.959929 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.959940 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:54 crc kubenswrapper[4546]: I0201 06:43:54.959949 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:54Z","lastTransitionTime":"2026-02-01T06:43:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.060930 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.060954 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.060963 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.060973 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.060980 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.169621 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.171278 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.171396 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.171471 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.171565 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.273918 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.273966 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.273978 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.273994 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.274022 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.376172 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.376378 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.376447 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.376522 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.376598 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.485935 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.485963 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.485972 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.485985 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.485995 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.587352 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.587370 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.587377 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.587389 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.587398 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.655099 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.655143 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:55 crc kubenswrapper[4546]: E0201 06:43:55.655221 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.655358 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.655387 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 09:20:01.740488537 +0000 UTC Feb 01 06:43:55 crc kubenswrapper[4546]: E0201 06:43:55.655459 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:55 crc kubenswrapper[4546]: E0201 06:43:55.655575 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.688583 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.688612 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.688622 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.688635 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.688645 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.790973 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.791006 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.791019 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.791034 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.791045 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.892375 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.892404 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.892412 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.892422 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.892430 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.994521 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.994550 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.994557 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.994566 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:55 crc kubenswrapper[4546]: I0201 06:43:55.994573 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:55Z","lastTransitionTime":"2026-02-01T06:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.008470 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nwmnb_95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16/kube-multus/0.log" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.008520 4546 generic.go:334] "Generic (PLEG): container finished" podID="95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16" containerID="bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271" exitCode=1 Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.008554 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwmnb" event={"ID":"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16","Type":"ContainerDied","Data":"bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.008805 4546 scope.go:117] "RemoveContainer" containerID="bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.024108 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\"
,\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.034477 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.045687 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.055199 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.063176 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.071649 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.082925 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:55Z\\\",\\\"message\\\":\\\"2026-02-01T06:43:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7\\\\n2026-02-01T06:43:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7 to /host/opt/cni/bin/\\\\n2026-02-01T06:43:10Z [verbose] multus-daemon started\\\\n2026-02-01T06:43:10Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:43:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.096648 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:29Z\\\",\\\"message\\\":\\\"}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0201 06:43:29.458061 6060 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0201 06:43:29.458061 6060 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:43:29.458073 6060 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0201 06:43:29.458087 6060 services_controller.go:451] Built service openshift-ingress-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.096781 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.097084 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.097157 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.097234 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.097305 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:56Z","lastTransitionTime":"2026-02-01T06:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.121683 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.132872 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.142782 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.153957 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc 
kubenswrapper[4546]: I0201 06:43:56.164810 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.177049 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.186398 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.201171 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.207944 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.210922 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.211005 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.211033 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.211083 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:56Z","lastTransitionTime":"2026-02-01T06:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.214378 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.313212 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.313257 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.313269 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.313293 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.313307 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:56Z","lastTransitionTime":"2026-02-01T06:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.415152 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.415190 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.415202 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.415220 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.415240 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:56Z","lastTransitionTime":"2026-02-01T06:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.517013 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.517050 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.517060 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.517078 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.517098 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:56Z","lastTransitionTime":"2026-02-01T06:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.619393 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.619426 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.619441 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.619459 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.619474 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:56Z","lastTransitionTime":"2026-02-01T06:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.654557 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:56 crc kubenswrapper[4546]: E0201 06:43:56.654726 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.655787 4546 scope.go:117] "RemoveContainer" containerID="b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.656022 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 09:59:45.946025192 +0000 UTC Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.721962 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.722005 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.722016 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.722032 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.722042 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:56Z","lastTransitionTime":"2026-02-01T06:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.825990 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.826029 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.826041 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.826066 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.826077 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:56Z","lastTransitionTime":"2026-02-01T06:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.927987 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.928020 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.928030 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.928042 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:56 crc kubenswrapper[4546]: I0201 06:43:56.928050 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:56Z","lastTransitionTime":"2026-02-01T06:43:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.014431 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nwmnb_95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16/kube-multus/0.log" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.014506 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwmnb" event={"ID":"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16","Type":"ContainerStarted","Data":"6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.016739 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/2.log" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.019256 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.019569 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.030228 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.030253 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.030261 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.030287 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc 
kubenswrapper[4546]: I0201 06:43:57.030299 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.032557 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.049167 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.061687 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.071611 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.079498 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.091784 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\"
,\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.101053 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.110681 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.121035 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.128927 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.132558 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.132589 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.132599 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc 
kubenswrapper[4546]: I0201 06:43:57.132612 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.132620 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.141413 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:55Z\\\",\\\"message\\\":\\\"2026-02-01T06:43:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7\\\\n2026-02-01T06:43:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7 to /host/opt/cni/bin/\\\\n2026-02-01T06:43:10Z [verbose] multus-daemon started\\\\n2026-02-01T06:43:10Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:43:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.159465 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:29Z\\\",\\\"message\\\":\\\"}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0201 06:43:29.458061 6060 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0201 06:43:29.458061 6060 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:43:29.458073 6060 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0201 06:43:29.458087 6060 services_controller.go:451] Built service openshift-ingress-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.169709 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.179359 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.189974 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.197940 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.215477 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc 
kubenswrapper[4546]: I0201 06:43:57.225969 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c339
9714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.234260 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.234294 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.234305 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.234322 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.234332 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.235934 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.247382 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.257208 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.268526 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.288219 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\"
,\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.296749 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.304304 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.311809 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.321776 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.330741 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.337242 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.337273 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.337282 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc 
kubenswrapper[4546]: I0201 06:43:57.337298 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.337309 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.341633 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:55Z\\\",\\\"message\\\":\\\"2026-02-01T06:43:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7\\\\n2026-02-01T06:43:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7 to /host/opt/cni/bin/\\\\n2026-02-01T06:43:10Z [verbose] multus-daemon started\\\\n2026-02-01T06:43:10Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:43:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.355681 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:29Z\\\",\\\"message\\\":\\\"}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0201 06:43:29.458061 6060 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0201 06:43:29.458061 6060 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:43:29.458073 6060 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0201 06:43:29.458087 6060 services_controller.go:451] Built service 
openshift-ingress-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/
\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.364149 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc 
kubenswrapper[4546]: I0201 06:43:57.373893 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.382243 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.393227 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.439463 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.439493 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.439506 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.439523 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.439564 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.541789 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.541827 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.541837 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.541851 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.541879 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.644007 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.644040 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.644049 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.644063 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.644073 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.654818 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:57 crc kubenswrapper[4546]: E0201 06:43:57.654943 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.654984 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.655005 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:57 crc kubenswrapper[4546]: E0201 06:43:57.655090 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:57 crc kubenswrapper[4546]: E0201 06:43:57.655249 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.656788 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 13:10:17.756527343 +0000 UTC Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.746018 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.746054 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.746067 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.746081 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.746092 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.848466 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.848494 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.848505 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.848518 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.848527 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.950775 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.950807 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.950818 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.950831 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:57 crc kubenswrapper[4546]: I0201 06:43:57.950840 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:57Z","lastTransitionTime":"2026-02-01T06:43:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.023988 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/3.log" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.025131 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/2.log" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.028955 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" exitCode=1 Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.029013 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.029075 4546 scope.go:117] "RemoveContainer" containerID="b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.030441 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:43:58 crc kubenswrapper[4546]: E0201 06:43:58.030651 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.050845 4546 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.052473 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.052517 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.052553 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.052568 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.052581 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.062046 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.071766 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-0
1T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.079436 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.087968 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.097973 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:55Z\\\",\\\"message\\\":\\\"2026-02-01T06:43:10+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7\\\\n2026-02-01T06:43:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7 to /host/opt/cni/bin/\\\\n2026-02-01T06:43:10Z [verbose] multus-daemon started\\\\n2026-02-01T06:43:10Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:43:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.110573 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9ff5f333b1fcdb65517329eb75a9697d59154a2b6e88710b98a99861157bbfe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:29Z\\\",\\\"message\\\":\\\"}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0201 06:43:29.458061 6060 services_controller.go:444] Built service openshift-ingress-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF0201 06:43:29.458061 6060 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:29Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:43:29.458073 6060 services_controller.go:445] Built service openshift-ingress-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0201 06:43:29.458087 6060 services_controller.go:451] Built service openshift-ingress-\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 06:43:57.321129 6454 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:43:57.321591 6454 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:43:57.321638 6454 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:43:57.321765 6454 
handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:43:57.321830 6454 factory.go:656] Stopping watch factory\\\\nI0201 06:43:57.321844 6454 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:43:57.322001 6454 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:43:57.379267 6454 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0201 06:43:57.379284 6454 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0201 06:43:57.379350 6454 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:43:57.379379 6454 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0201 06:43:57.379449 6454 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\
\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.118402 4546 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc 
kubenswrapper[4546]: I0201 06:43:58.126343 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.134369 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.142312 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.153422 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.154159 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.154180 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.154188 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.154199 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.154206 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.162956 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.173037 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 
named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.181680 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-0
1T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.190261 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.203662 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.257024 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.257066 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.257079 4546 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.257109 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.257123 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.358850 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.358906 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.358916 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.358931 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.358941 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.460806 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.460831 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.460872 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.460887 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.460896 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.562466 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.562492 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.562501 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.562512 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.562522 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.654177 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:43:58 crc kubenswrapper[4546]: E0201 06:43:58.654412 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.657237 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 18:04:48.318124608 +0000 UTC Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.664347 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.664368 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.664378 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.664388 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.664395 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.766468 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.766510 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.766523 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.766557 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.766571 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.867982 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.868017 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.868046 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.868058 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.868067 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.970441 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.970473 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.970486 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.970501 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:58 crc kubenswrapper[4546]: I0201 06:43:58.970510 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:58Z","lastTransitionTime":"2026-02-01T06:43:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.033815 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/3.log" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.036949 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:43:59 crc kubenswrapper[4546]: E0201 06:43:59.037221 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.062032 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.072135 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.072613 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.072640 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.072653 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.072670 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.072680 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.082409 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.093845 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.102895 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.112490 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.122279 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\"
,\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.130692 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.140828 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.150071 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.157502 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.166146 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6
a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.174216 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.174295 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.174309 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.174329 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.174340 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.176213 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:55Z\\\",\\\"message\\\":\\\"2026-02-01T06:43:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7\\\\n2026-02-01T06:43:10+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7 to /host/opt/cni/bin/\\\\n2026-02-01T06:43:10Z [verbose] multus-daemon started\\\\n2026-02-01T06:43:10Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:43:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.198235 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 06:43:57.321129 6454 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:43:57.321591 6454 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:43:57.321638 6454 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:43:57.321765 6454 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:43:57.321830 6454 factory.go:656] Stopping watch factory\\\\nI0201 06:43:57.321844 6454 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:43:57.322001 6454 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:43:57.379267 6454 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0201 06:43:57.379284 6454 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0201 06:43:57.379350 6454 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:43:57.379379 6454 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0201 06:43:57.379449 6454 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.222884 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.247447 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.257057 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc 
kubenswrapper[4546]: I0201 06:43:59.276818 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.276844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.276877 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.276894 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.276905 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.378844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.378892 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.378904 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.378922 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.378936 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.481034 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.481068 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.481083 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.481097 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.481109 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.583295 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.583328 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.583337 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.583349 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.583357 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.654217 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:43:59 crc kubenswrapper[4546]: E0201 06:43:59.654324 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.654487 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:43:59 crc kubenswrapper[4546]: E0201 06:43:59.654547 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.654655 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:43:59 crc kubenswrapper[4546]: E0201 06:43:59.654702 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.657606 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 06:36:58.546201249 +0000 UTC Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.669832 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0
ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:55Z\\\",\\\"message\\\":\\\"2026-02-01T06:43:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7\\\\n2026-02-01T06:43:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7 to /host/opt/cni/bin/\\\\n2026-02-01T06:43:10Z [verbose] multus-daemon started\\\\n2026-02-01T06:43:10Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:43:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name
\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.685479 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.685580 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.685638 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.685690 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.685747 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.686383 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 06:43:57.321129 6454 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:43:57.321591 6454 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:43:57.321638 6454 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:43:57.321765 6454 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:43:57.321830 6454 factory.go:656] Stopping watch factory\\\\nI0201 06:43:57.321844 6454 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:43:57.322001 6454 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:43:57.379267 6454 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0201 06:43:57.379284 6454 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0201 06:43:57.379350 6454 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:43:57.379379 6454 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0201 06:43:57.379449 6454 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.702367 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.711168 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.720168 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.726956 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.735431 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.742713 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc 
kubenswrapper[4546]: I0201 06:43:59.751030 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.758767 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.767843 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.777961 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.787186 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.787636 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.787725 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.787795 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.787893 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.787978 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.797355 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c
7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 
genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0
5aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.807010 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.817183 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.825096 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:43:59Z is after 2025-08-24T17:21:41Z" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.890133 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.890164 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.890173 4546 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.890190 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.890203 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.991852 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.991929 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.991941 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.991959 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:43:59 crc kubenswrapper[4546]: I0201 06:43:59.991972 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:43:59Z","lastTransitionTime":"2026-02-01T06:43:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.097090 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.097126 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.097138 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.097154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.097165 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:00Z","lastTransitionTime":"2026-02-01T06:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.199491 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.199590 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.199654 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.199708 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.199756 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:00Z","lastTransitionTime":"2026-02-01T06:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.301630 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.301669 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.301682 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.301699 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.301712 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:00Z","lastTransitionTime":"2026-02-01T06:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.403064 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.403091 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.403107 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.403122 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.403131 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:00Z","lastTransitionTime":"2026-02-01T06:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.504971 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.505004 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.505014 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.505027 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.505038 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:00Z","lastTransitionTime":"2026-02-01T06:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.607880 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.607908 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.607939 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.607951 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.607962 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:00Z","lastTransitionTime":"2026-02-01T06:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.654286 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:00 crc kubenswrapper[4546]: E0201 06:44:00.654381 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.658373 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 15:51:55.141344668 +0000 UTC Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.709834 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.709881 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.709892 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.709906 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.709916 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:00Z","lastTransitionTime":"2026-02-01T06:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.812353 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.812698 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.812764 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.812834 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.812910 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:00Z","lastTransitionTime":"2026-02-01T06:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.914971 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.915006 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.915014 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.915030 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:00 crc kubenswrapper[4546]: I0201 06:44:00.915039 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:00Z","lastTransitionTime":"2026-02-01T06:44:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.017269 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.017362 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.017441 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.017515 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.017580 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.126304 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.126412 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.126488 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.126566 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.126629 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.229152 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.229197 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.229208 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.229223 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.229235 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.331920 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.331950 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.331961 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.331976 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.331985 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.433556 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.433651 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.433714 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.433774 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.433824 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.535352 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.535402 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.535413 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.535427 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.535440 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.637648 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.637749 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.637829 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.637935 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.637990 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.654046 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.654148 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.654155 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:01 crc kubenswrapper[4546]: E0201 06:44:01.654434 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:01 crc kubenswrapper[4546]: E0201 06:44:01.654554 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:01 crc kubenswrapper[4546]: E0201 06:44:01.654722 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.658527 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:21:14.697695998 +0000 UTC Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.740050 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.740105 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.740120 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.740140 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.740158 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.832506 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.832559 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.832570 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.832589 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.832605 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: E0201 06:44:01.850265 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.853996 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.854028 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.854039 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.854055 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.854065 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: E0201 06:44:01.864434 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.867898 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.867920 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.867931 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.867945 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.867954 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: E0201 06:44:01.877587 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.880426 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.880466 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.880479 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.880496 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.880509 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: E0201 06:44:01.890282 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.893184 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.893222 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.893239 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.893258 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.893272 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:01 crc kubenswrapper[4546]: E0201 06:44:01.902968 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:01 crc kubenswrapper[4546]: E0201 06:44:01.903073 4546 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.904129 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.904166 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.904182 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.904199 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:01 crc kubenswrapper[4546]: I0201 06:44:01.904211 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:01Z","lastTransitionTime":"2026-02-01T06:44:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.006144 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.006182 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.006194 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.006208 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.006218 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.108557 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.108596 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.108612 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.108629 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.108640 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.210541 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.210599 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.210610 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.210628 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.210642 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.313353 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.313391 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.313399 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.313413 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.313423 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.415110 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.415142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.415154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.415182 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.415196 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.517699 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.517727 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.517736 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.517749 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.517758 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.621445 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.621477 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.621487 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.621501 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.621512 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.653830 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:02 crc kubenswrapper[4546]: E0201 06:44:02.653964 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.658949 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 09:22:34.656263549 +0000 UTC Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.723468 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.723496 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.723507 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.723524 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.723541 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.826656 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.826693 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.826705 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.826729 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.826739 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.928567 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.928611 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.928621 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.928632 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:02 crc kubenswrapper[4546]: I0201 06:44:02.928641 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:02Z","lastTransitionTime":"2026-02-01T06:44:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.030778 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.030831 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.030844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.030880 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.030891 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.133157 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.133221 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.133231 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.133244 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.133252 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.234957 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.234978 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.234989 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.234999 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.235009 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.336688 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.336704 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.336713 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.336721 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.336728 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.438820 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.438842 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.438851 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.438876 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.438883 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.542765 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.542799 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.542809 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.542821 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.542830 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.644449 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.644469 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.644478 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.644487 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.644494 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.655953 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:03 crc kubenswrapper[4546]: E0201 06:44:03.656043 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.656164 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:03 crc kubenswrapper[4546]: E0201 06:44:03.656209 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.656313 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:03 crc kubenswrapper[4546]: E0201 06:44:03.656362 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.659433 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 06:07:09.68176381 +0000 UTC Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.746380 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.746400 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.746408 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.746512 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.746556 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.848565 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.848620 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.848633 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.848648 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.848657 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.950639 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.950668 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.950677 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.950687 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:03 crc kubenswrapper[4546]: I0201 06:44:03.950715 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:03Z","lastTransitionTime":"2026-02-01T06:44:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.053568 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.053607 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.053617 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.053630 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.053642 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.155363 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.155404 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.155415 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.155431 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.155441 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.257105 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.257154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.257165 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.257193 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.257208 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.358994 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.359032 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.359041 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.359054 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.359064 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.461522 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.461561 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.461570 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.461581 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.461590 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.562845 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.562905 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.562916 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.562929 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.562938 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.654260 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:04 crc kubenswrapper[4546]: E0201 06:44:04.654416 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.660287 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 20:19:16.457452928 +0000 UTC Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.664713 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.664759 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.664769 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.664780 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.664789 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.766255 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.766281 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.766290 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.766301 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.766308 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.867507 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.867539 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.867549 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.867559 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.867567 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.969344 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.969395 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.969405 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.969420 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:04 crc kubenswrapper[4546]: I0201 06:44:04.969430 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:04Z","lastTransitionTime":"2026-02-01T06:44:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.070688 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.070712 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.070722 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.070733 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.070741 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.172955 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.172973 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.172981 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.172990 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.172998 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.275066 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.275088 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.275098 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.275108 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.275115 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.376781 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.376925 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.377011 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.377074 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.377126 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.480130 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.480182 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.480192 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.480212 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.480223 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.582070 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.582097 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.582105 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.582114 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.582139 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.654069 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.654123 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.654221 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:05 crc kubenswrapper[4546]: E0201 06:44:05.654212 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:05 crc kubenswrapper[4546]: E0201 06:44:05.654336 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:05 crc kubenswrapper[4546]: E0201 06:44:05.654383 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.660355 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 03:55:56.452392368 +0000 UTC Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.684360 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.684396 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.684406 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.684420 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.684429 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.786126 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.786152 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.786161 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.786171 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.786178 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.888043 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.888098 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.888114 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.888134 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.888152 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.990544 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.990592 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.990602 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.990613 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:05 crc kubenswrapper[4546]: I0201 06:44:05.990625 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:05Z","lastTransitionTime":"2026-02-01T06:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.093058 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.093095 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.093105 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.093121 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.093133 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:06Z","lastTransitionTime":"2026-02-01T06:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.194571 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.194604 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.194611 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.194627 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.194639 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:06Z","lastTransitionTime":"2026-02-01T06:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.297174 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.297216 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.297226 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.297241 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.297254 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:06Z","lastTransitionTime":"2026-02-01T06:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.399146 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.399178 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.399190 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.399202 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.399210 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:06Z","lastTransitionTime":"2026-02-01T06:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.501307 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.501355 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.501365 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.501383 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.501398 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:06Z","lastTransitionTime":"2026-02-01T06:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.603620 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.603652 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.603661 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.603674 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.603683 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:06Z","lastTransitionTime":"2026-02-01T06:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.654154 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:06 crc kubenswrapper[4546]: E0201 06:44:06.654258 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.661259 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 03:25:05.658108075 +0000 UTC Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.705309 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.705372 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.705388 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.705409 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.705423 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:06Z","lastTransitionTime":"2026-02-01T06:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.808281 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.808316 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.808325 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.808339 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.808347 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:06Z","lastTransitionTime":"2026-02-01T06:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.911439 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.911481 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.911491 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.911509 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:06 crc kubenswrapper[4546]: I0201 06:44:06.911519 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:06Z","lastTransitionTime":"2026-02-01T06:44:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.013599 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.013635 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.013647 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.013658 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.013668 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.116711 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.116741 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.116754 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.116766 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.116775 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.219903 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.220004 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.220073 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.220144 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.220202 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.322022 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.322062 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.322072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.322086 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.322096 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.424473 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.424600 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.424672 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.424728 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.424786 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.526675 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.526703 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.526711 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.526722 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.526733 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.629142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.629197 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.629208 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.629231 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.629244 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.654720 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.654727 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:07 crc kubenswrapper[4546]: E0201 06:44:07.654918 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:07 crc kubenswrapper[4546]: E0201 06:44:07.654968 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.654747 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:07 crc kubenswrapper[4546]: E0201 06:44:07.655063 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.661799 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 14:35:07.30327003 +0000 UTC Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.732109 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.732179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.732191 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.732216 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.732236 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.835091 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.835128 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.835136 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.835151 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.835164 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.938007 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.938045 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.938057 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.938075 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:07 crc kubenswrapper[4546]: I0201 06:44:07.938084 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:07Z","lastTransitionTime":"2026-02-01T06:44:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.039344 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.039377 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.039388 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.039403 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.039411 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.143663 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.143810 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.143930 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.143994 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.144079 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.246045 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.246500 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.246584 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.246654 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.246710 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.350879 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.350915 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.350924 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.350939 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.350952 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.452889 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.452939 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.452969 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.452989 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.452999 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.555174 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.555216 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.555230 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.555267 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.555277 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.653902 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:08 crc kubenswrapper[4546]: E0201 06:44:08.654040 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.657342 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.657377 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.657386 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.657399 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.657409 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.662640 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 03:28:09.070735606 +0000 UTC Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.759215 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.759247 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.759257 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.759268 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.759280 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.861688 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.861735 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.861746 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.861764 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.861777 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.963567 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.963614 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.963626 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.963646 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:08 crc kubenswrapper[4546]: I0201 06:44:08.963658 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:08Z","lastTransitionTime":"2026-02-01T06:44:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.066257 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.066290 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.066301 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.066316 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.066329 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.168806 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.168849 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.168883 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.168908 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.168920 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.271166 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.271218 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.271237 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.271253 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.271265 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.372522 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.372562 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.372572 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.372583 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.372591 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.474099 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.474143 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.474153 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.474173 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.474187 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.576545 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.576588 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.576599 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.576612 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.576620 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.654520 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:09 crc kubenswrapper[4546]: E0201 06:44:09.654645 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.654810 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:09 crc kubenswrapper[4546]: E0201 06:44:09.654907 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.655002 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:09 crc kubenswrapper[4546]: E0201 06:44:09.655159 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.663739 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 23:51:43.994309003 +0000 UTC Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.667563 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d2
06a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 
06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.678288 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.678326 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.678359 4546 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.678372 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.678381 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.678248 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.688717 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.696489 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.703896 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce759
0af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.715476 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:55Z\\\",\\\"message\\\":\\\"2026-02-01T06:43:10+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7\\\\n2026-02-01T06:43:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7 to /host/opt/cni/bin/\\\\n2026-02-01T06:43:10Z [verbose] multus-daemon started\\\\n2026-02-01T06:43:10Z [verbose] 
Readiness Indicator file check\\\\n2026-02-01T06:43:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.730111 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 06:43:57.321129 6454 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:43:57.321591 6454 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:43:57.321638 6454 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:43:57.321765 6454 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:43:57.321830 6454 factory.go:656] Stopping watch factory\\\\nI0201 06:43:57.321844 6454 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:43:57.322001 6454 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:43:57.379267 6454 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0201 06:43:57.379284 6454 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0201 06:43:57.379350 6454 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:43:57.379379 6454 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0201 06:43:57.379449 6454 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.739783 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.749605 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.758678 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.765385 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.778504 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc 
kubenswrapper[4546]: I0201 06:44:09.779812 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.779845 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.779870 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.779884 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.779893 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.786976 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.794383 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.801493 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.810578 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d3
01abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.817189 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.882417 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.882456 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.882466 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.882479 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.882492 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.984161 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.984197 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.984209 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.984221 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:09 crc kubenswrapper[4546]: I0201 06:44:09.984230 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:09Z","lastTransitionTime":"2026-02-01T06:44:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.086118 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.086144 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.086153 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.086167 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.086179 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:10Z","lastTransitionTime":"2026-02-01T06:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.188727 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.188764 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.188773 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.188792 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.188803 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:10Z","lastTransitionTime":"2026-02-01T06:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.291545 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.291586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.291599 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.291617 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.291629 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:10Z","lastTransitionTime":"2026-02-01T06:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.393732 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.393771 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.393782 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.393799 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.393809 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:10Z","lastTransitionTime":"2026-02-01T06:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.496030 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.496059 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.496070 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.496084 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.496095 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:10Z","lastTransitionTime":"2026-02-01T06:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.598226 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.598270 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.598281 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.598297 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.598309 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:10Z","lastTransitionTime":"2026-02-01T06:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.653921 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:10 crc kubenswrapper[4546]: E0201 06:44:10.654061 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.664151 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 16:37:21.49659437 +0000 UTC Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.700223 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.700246 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.700254 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.700264 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.700272 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:10Z","lastTransitionTime":"2026-02-01T06:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.801673 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.801695 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.801704 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.801715 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.801724 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:10Z","lastTransitionTime":"2026-02-01T06:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.903783 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.903813 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.903820 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.903838 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:10 crc kubenswrapper[4546]: I0201 06:44:10.903848 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:10Z","lastTransitionTime":"2026-02-01T06:44:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.005990 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.006110 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.006128 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.006139 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.006149 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.107944 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.107975 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.107984 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.107997 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.108007 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.210117 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.210147 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.210158 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.210169 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.210177 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.312044 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.312066 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.312074 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.312083 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.312091 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.413932 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.413949 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.413957 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.413968 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.413976 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.464517 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.464598 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.464623 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.464684 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:45:15.464661313 +0000 UTC m=+146.115597330 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.464696 4546 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.464715 4546 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.464759 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:45:15.464746433 +0000 UTC m=+146.115682449 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.464774 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-01 06:45:15.464767343 +0000 UTC m=+146.115703359 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.516309 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.516337 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.516345 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.516353 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.516362 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.566084 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.566119 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.566177 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.566194 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.566204 4546 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.566231 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:45:15.566223778 +0000 UTC m=+146.217159794 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.566237 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.566251 4546 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.566260 4546 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.566287 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:45:15.566279654 +0000 UTC m=+146.217215660 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.617586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.617619 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.617629 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.617641 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.617649 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.654840 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.655082 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.655120 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.655122 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.655362 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.655425 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.655641 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:44:11 crc kubenswrapper[4546]: E0201 06:44:11.655765 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.664333 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:54:35.823091193 +0000 UTC Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.719907 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.719940 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.719950 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.719961 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.719969 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.822266 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.822301 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.822310 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.822322 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.822331 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.924390 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.924510 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.924586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.924671 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:11 crc kubenswrapper[4546]: I0201 06:44:11.924735 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:11Z","lastTransitionTime":"2026-02-01T06:44:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.026751 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.026789 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.026800 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.026825 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.026837 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.129268 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.129326 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.129339 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.129363 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.129376 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.231706 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.231739 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.231748 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.231765 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.231777 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.235598 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.235639 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.235651 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.235665 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.235676 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: E0201 06:44:12.245146 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.248606 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.248656 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.248671 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.248693 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.248707 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: E0201 06:44:12.258583 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.261343 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.261380 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.261391 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.261426 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.261438 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: E0201 06:44:12.270316 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.273446 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.273542 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.273619 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.273694 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.273750 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.286813 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.286844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.286874 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.286891 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.286901 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: E0201 06:44:12.295292 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:12Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:12 crc kubenswrapper[4546]: E0201 06:44:12.295397 4546 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.334238 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.334354 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.334436 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.334505 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.334589 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.436887 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.436979 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.437050 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.437113 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.437174 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.538976 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.539009 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.539019 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.539033 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.539043 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.640361 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.640390 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.640399 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.640411 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.640420 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.654590 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:12 crc kubenswrapper[4546]: E0201 06:44:12.654697 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.664779 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 07:19:35.674655626 +0000 UTC Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.742496 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.742532 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.742543 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.742552 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.742561 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.844018 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.844138 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.844150 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.844162 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.844171 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.945959 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.945990 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.945998 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.946010 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:12 crc kubenswrapper[4546]: I0201 06:44:12.946018 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:12Z","lastTransitionTime":"2026-02-01T06:44:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.047715 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.047740 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.047751 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.047762 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.047769 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.149717 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.149748 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.149757 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.149770 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.149779 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.251901 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.251929 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.251938 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.251947 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.251953 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.353599 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.353651 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.353663 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.353683 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.353697 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.455500 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.455567 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.455578 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.455611 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.455626 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.558103 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.558165 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.558175 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.558191 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.558204 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.654923 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.654993 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:13 crc kubenswrapper[4546]: E0201 06:44:13.655103 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.655153 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:13 crc kubenswrapper[4546]: E0201 06:44:13.655171 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:13 crc kubenswrapper[4546]: E0201 06:44:13.655307 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.661217 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.661317 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.661377 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.661462 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.661519 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.665086 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 10:28:20.48772157 +0000 UTC Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.765593 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.765622 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.765634 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.765648 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.765659 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.867209 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.867313 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.867371 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.867430 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.867490 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.969292 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.969411 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.969472 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.969542 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:13 crc kubenswrapper[4546]: I0201 06:44:13.969600 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:13Z","lastTransitionTime":"2026-02-01T06:44:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.071537 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.071572 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.071585 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.071597 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.071607 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.173017 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.173070 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.173079 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.173102 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.173115 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.275213 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.275277 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.275287 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.275298 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.275306 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.377331 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.377355 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.377363 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.377372 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.377398 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.478964 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.478993 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.479003 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.479014 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.479022 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.580657 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.580718 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.580732 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.580744 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.580753 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.654759 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:14 crc kubenswrapper[4546]: E0201 06:44:14.654896 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.665878 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 15:05:42.134326156 +0000 UTC Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.682203 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.682228 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.682237 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.682247 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.682258 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.784485 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.784514 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.784531 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.784545 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.784555 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.886263 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.886293 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.886302 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.886315 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.886325 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.988027 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.988064 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.988073 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.988086 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:14 crc kubenswrapper[4546]: I0201 06:44:14.988095 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:14Z","lastTransitionTime":"2026-02-01T06:44:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.089832 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.089895 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.089909 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.089922 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.089933 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:15Z","lastTransitionTime":"2026-02-01T06:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.191593 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.191623 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.191632 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.191643 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.191654 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:15Z","lastTransitionTime":"2026-02-01T06:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.293442 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.293463 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.293471 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.293480 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.293490 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:15Z","lastTransitionTime":"2026-02-01T06:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.395023 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.395049 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.395059 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.395069 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.395099 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:15Z","lastTransitionTime":"2026-02-01T06:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.496148 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.496172 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.496187 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.496216 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.496224 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:15Z","lastTransitionTime":"2026-02-01T06:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.597653 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.597793 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.597896 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.597961 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.598012 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:15Z","lastTransitionTime":"2026-02-01T06:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.654601 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.654621 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.654641 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:15 crc kubenswrapper[4546]: E0201 06:44:15.654693 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:15 crc kubenswrapper[4546]: E0201 06:44:15.654898 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:15 crc kubenswrapper[4546]: E0201 06:44:15.654922 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.666456 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 06:56:19.868538631 +0000 UTC Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.699658 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.699680 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.699690 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.699699 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.699708 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:15Z","lastTransitionTime":"2026-02-01T06:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.801483 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.801507 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.801515 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.801532 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.801542 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:15Z","lastTransitionTime":"2026-02-01T06:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.903140 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.903174 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.903185 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.903199 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:15 crc kubenswrapper[4546]: I0201 06:44:15.903210 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:15Z","lastTransitionTime":"2026-02-01T06:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.004700 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.004729 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.004738 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.004750 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.004760 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.106248 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.106283 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.106292 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.106306 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.106316 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.207675 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.207699 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.207707 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.207716 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.207723 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.309756 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.309786 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.309796 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.309809 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.309817 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.411293 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.411334 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.411343 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.411358 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.411366 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.513152 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.513188 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.513198 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.513212 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.513221 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.615314 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.615343 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.615351 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.615361 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.615369 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.654047 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:16 crc kubenswrapper[4546]: E0201 06:44:16.654152 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.666657 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 17:00:50.905343985 +0000 UTC Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.717149 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.717171 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.717179 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.717189 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.717199 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.819163 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.819232 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.819243 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.819257 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.819267 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.920555 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.920586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.920594 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.920605 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:16 crc kubenswrapper[4546]: I0201 06:44:16.920613 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:16Z","lastTransitionTime":"2026-02-01T06:44:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.023141 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.023207 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.023220 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.023246 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.023259 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.125112 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.125140 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.125149 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.125159 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.125167 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.226136 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.226162 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.226174 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.226186 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.226196 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.327790 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.327815 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.327825 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.327837 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.327845 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.429514 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.429550 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.429574 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.429589 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.429600 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.531503 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.531543 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.531551 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.531560 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.531567 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.633119 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.633152 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.633164 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.633176 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.633185 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.654613 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:17 crc kubenswrapper[4546]: E0201 06:44:17.654702 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.654764 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:17 crc kubenswrapper[4546]: E0201 06:44:17.654807 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.655003 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:17 crc kubenswrapper[4546]: E0201 06:44:17.655176 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.663466 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.667318 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 23:50:31.744687025 +0000 UTC Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.734498 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.734534 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.734544 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.734554 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.734561 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.836337 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.836368 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.836377 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.836389 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.836399 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.938423 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.938467 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.938478 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.938496 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:17 crc kubenswrapper[4546]: I0201 06:44:17.938507 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:17Z","lastTransitionTime":"2026-02-01T06:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.040427 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.040462 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.040471 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.040484 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.040494 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.142042 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.142178 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.142243 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.142312 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.142381 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.243842 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.243905 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.243916 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.243931 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.243940 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.345554 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.345585 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.345594 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.345606 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.345614 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.447009 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.447042 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.447051 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.447064 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.447073 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.548794 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.548844 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.548867 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.548881 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.548891 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.650736 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.650765 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.650773 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.650783 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.650791 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.654267 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:18 crc kubenswrapper[4546]: E0201 06:44:18.654347 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.667662 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 23:15:55.425121066 +0000 UTC Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.751731 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.751758 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.751767 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.751780 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.751798 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.856383 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.856553 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.856634 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.856696 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.856771 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.958147 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.958176 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.958185 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.958195 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:18 crc kubenswrapper[4546]: I0201 06:44:18.958203 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:18Z","lastTransitionTime":"2026-02-01T06:44:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.059764 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.059792 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.059805 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.059821 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.059831 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.161922 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.161954 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.161963 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.161977 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.161985 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.263448 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.263473 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.263482 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.263492 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.263499 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.364977 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.365006 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.365015 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.365027 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.365038 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.466881 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.466900 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.466908 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.466917 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.466925 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.568073 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.568097 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.568106 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.568116 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.568125 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.654576 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:19 crc kubenswrapper[4546]: E0201 06:44:19.654657 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.654902 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:19 crc kubenswrapper[4546]: E0201 06:44:19.655106 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.654932 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:19 crc kubenswrapper[4546]: E0201 06:44:19.655340 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.666925 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"218c5efd-c97f-48e1-883f-ec381e0a559b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b032df2294f5a5faf6b9a59d84c71de1567b1bf7e7b628b73f5449954b4df8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b686ddaa9d4f80eedac4fdee91076781daa6709672baf214c5d44b3ebf148bd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://045950bbaf81aeb99896d0777c3399714c11a1321b8558732df21fdcfe4b61d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efdd0a7695f404805c0626c5094de56357af21f3dd05d1438b71d8b06eaa9497\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://133d301abbe9a20519d68a3eb155498393d4bac75a0d4f4c0d69e1e59b5a1a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7859464bf6843144bd15d454f5b75907254e07f48d9415f25e7bef56af5eaab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9f663da09804d56820aefbd721f155b47be5a208c562d5f591c535609f0863c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zh4jc\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mj5bf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.667928 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 20:12:02.955477258 +0000 UTC Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.670352 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.670381 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.670391 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.670404 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.670412 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.678408 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fxcn7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62d4004d-9bf8-4b57-9193-4a8ad5aa3977\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b6ad79c26009b49c3f7e97914b6d1daf5e473601b0d0aa750497b7b2c51fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b5pw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:15Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fxcn7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.689251 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89c8ff88-ae22-40a1-b11d-8288582e08c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apise
rver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"01 06:43:07.387265 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:43:07.387268 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:43:07.387270 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 06:43:07.387516 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0201 06:43:07.393669 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1769928171\\\\\\\\\\\\\\\" (2026-02-01 06:42:50 +0000 UTC to 2026-03-03 06:42:51 +0000 UTC (now=2026-02-01 06:43:07.393626323 +0000 
UTC))\\\\\\\"\\\\nI0201 06:43:07.393775 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1769928182\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1769928181\\\\\\\\\\\\\\\" (2026-02-01 05:43:01 +0000 UTC to 2027-02-01 05:43:01 +0000 UTC (now=2026-02-01 06:43:07.393757747 +0000 UTC))\\\\\\\"\\\\nI0201 06:43:07.393795 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI0201 06:43:07.393817 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0201 06:43:07.393838 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393882 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0201 06:43:07.393911 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4071431724/tls.crt::/tmp/serving-cert-4071431724/tls.key\\\\\\\"\\\\nI0201 06:43:07.393999 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nF0201 06:43:07.395992 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.697802 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c936e9e0-fc69-40d0-bc70-2cbc57ac38ea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03a9a218928889a5c19a12c20dd448480df7077f54de2167deae4cb249056eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df932ddd6c734a2cf4514329d6c1ec54c3fb694c13a0e53830e9afe9fd7c20a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80f532c7d58df2971c118300bb99de56fb8f45572c980f05ee1c3882b536c7e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-0
1T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87ffa831dc7f287dfd54e6dad3f8cb7cefce31674030ed25b8a55d68d845d1c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.706627 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://614dd6455f568766222f2717cdc2b25b79edc2e93cbd2678cc5a0ceeb760543a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e1c58dd01adddb3875eef587399621b8083be5bb7af785b6397ab5f665428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.715626 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9032e2c3-caef-4e24-95a3-2d67a9a1e8c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3d7561862303636fc0833afc34c81f79fe21677d4afb47827a6d8f3f4bcf75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca6b483f454c4f25c6c681267addb8f5f51
5e3891e1005d2594426172932e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fcbb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-z487m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.725601 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4gpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"73cf3878-2b3f-4ac6-b698-c86ac72baa90\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://812fa346a907e5dbca95a24d244e6ffce7f9dcbe9c7a9282c6a9e3fddeb3de47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4xk5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4gpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.734092 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a4316448-1833-40f9-bdd7-e13d7dd4da6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90bfcb0bb36068f0f56829cf875a3b6a9a4a262d5bc1cca1ae7a0c64fd5d9411\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxv6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dwtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.744077 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nwmnb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:55Z\\\",\\\"message\\\":\\\"2026-02-01T06:43:10+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7\\\\n2026-02-01T06:43:10+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_dcd95ded-b885-4c6e-8edc-21659d3e54c7 to /host/opt/cni/bin/\\\\n2026-02-01T06:43:10Z [verbose] multus-daemon started\\\\n2026-02-01T06:43:10Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:43:55Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wfgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nwmnb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.757536 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:11Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:43:57Z\\\",\\\"message\\\":\\\"tor.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 06:43:57.321129 6454 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:43:57.321591 6454 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:43:57.321638 6454 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:43:57.321765 6454 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:43:57.321830 6454 factory.go:656] Stopping watch factory\\\\nI0201 06:43:57.321844 6454 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:43:57.322001 6454 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:43:57.379267 6454 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0201 06:43:57.379284 6454 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0201 06:43:57.379350 6454 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:43:57.379379 6454 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0201 06:43:57.379449 6454 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://634e27a69e7c44878a
19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2g65h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4klz2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.764791 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afadeb36-c7c3-47bd-96be-098424b7ea35\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9ca109332035cb3553f13bf64fcd53687b226c671d48b29ef934739a900664a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87865a25eb5c15a9aa3c39a76bedd450341862f78796be8dafcaf2547641b7d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87865a25eb5c15a9aa3c39a76bedd450341862f78796be8dafcaf2547641b7d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:42:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.771674 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.771702 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.771712 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.771724 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.771730 4546 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.774368 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e118176-d507-4ecc-b403-7b48cc1d6e15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:42:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0378484e69bfb56c5ad12c3797473d6e4ee4b464c547518959abe3b92372511e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a314f2a48dbfc55ffcff36dd64e583f31c9a3ce378b21f3de30bd23e90c4d7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b923cfc572ed1140e1f91a5a0b31cdb24b8d446c3c9e813f06bf8d7a9ad705d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:42:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.783479 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://509a74f3aefd9a946477e8dcba05dbc89d49f3054b46877a067373b38eb59a84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.791053 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ac207c52ca1416ef7fbf67c0b736b3acf68398b22c86a18eb70a4d21e3c6605\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:43:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.797905 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8tdck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9zzv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:43:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8tdck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc 
kubenswrapper[4546]: I0201 06:44:19.806086 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.814227 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.822188 4546 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:43:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.873586 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.873611 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.873621 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.873636 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.873645 4546 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.975772 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.975802 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.975810 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.975823 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:19 crc kubenswrapper[4546]: I0201 06:44:19.975832 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:19Z","lastTransitionTime":"2026-02-01T06:44:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.077918 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.077945 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.077955 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.077970 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.077979 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.179551 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.179579 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.179588 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.179597 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.179604 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.281236 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.281266 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.281293 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.281305 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.281313 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.382692 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.382851 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.382932 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.382999 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.383071 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.485079 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.485188 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.485251 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.485306 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.485367 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.586758 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.586785 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.586793 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.586806 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.586815 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.654442 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:20 crc kubenswrapper[4546]: E0201 06:44:20.654548 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.668484 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 04:03:42.517990162 +0000 UTC Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.688973 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.689023 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.689035 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.689046 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.689055 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.790736 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.790756 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.790764 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.790774 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.790780 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.892008 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.892056 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.892072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.892088 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.892100 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.993621 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.993650 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.993658 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.993668 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:20 crc kubenswrapper[4546]: I0201 06:44:20.993675 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:20Z","lastTransitionTime":"2026-02-01T06:44:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.095627 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.095664 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.095678 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.095694 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.095704 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:21Z","lastTransitionTime":"2026-02-01T06:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.197156 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.197181 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.197189 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.197198 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.197206 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:21Z","lastTransitionTime":"2026-02-01T06:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.299146 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.299176 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.299186 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.299197 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.299208 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:21Z","lastTransitionTime":"2026-02-01T06:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.401162 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.401191 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.401199 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.401211 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.401220 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:21Z","lastTransitionTime":"2026-02-01T06:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.502724 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.502751 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.502759 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.502769 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.502777 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:21Z","lastTransitionTime":"2026-02-01T06:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.604194 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.604230 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.604238 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.604252 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.604261 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:21Z","lastTransitionTime":"2026-02-01T06:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.653961 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.653982 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:21 crc kubenswrapper[4546]: E0201 06:44:21.654059 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.654109 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:21 crc kubenswrapper[4546]: E0201 06:44:21.654156 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:21 crc kubenswrapper[4546]: E0201 06:44:21.654310 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.669020 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 07:42:25.549120488 +0000 UTC Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.705577 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.705596 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.705604 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.705614 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.705620 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:21Z","lastTransitionTime":"2026-02-01T06:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.807188 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.807235 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.807247 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.807270 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.807285 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:21Z","lastTransitionTime":"2026-02-01T06:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.908799 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.908826 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.908836 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.908845 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:21 crc kubenswrapper[4546]: I0201 06:44:21.908874 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:21Z","lastTransitionTime":"2026-02-01T06:44:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.014187 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.014211 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.014221 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.014233 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.014241 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.116651 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.116691 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.116702 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.116718 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.116730 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.219048 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.219087 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.219095 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.219106 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.219116 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.321139 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.321171 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.321180 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.321194 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.321201 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.423092 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.423119 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.423127 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.423143 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.423152 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.522693 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.522753 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.522768 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.522791 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.522803 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: E0201 06:44:22.532409 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.535350 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.535374 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.535382 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.535393 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.535401 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: E0201 06:44:22.543711 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.546246 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.546274 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.546283 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.546294 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.546302 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: E0201 06:44:22.555307 4546 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:44:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.558078 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.558119 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.558129 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.558144 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.558156 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fa428684-6fb6-45d8-b94c-216375fbfbe8\\\",\\\"systemUUID\\\":\\\"9a98126f-f656-4047-9b34-a8185f08b8ca\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:44:22Z is after 2025-08-24T17:21:41Z" Feb 01 06:44:22 crc kubenswrapper[4546]: E0201 06:44:22.579664 4546 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.580587 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.580614 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.580623 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.580639 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.580649 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.654696 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:22 crc kubenswrapper[4546]: E0201 06:44:22.654798 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.669221 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 04:23:21.960687595 +0000 UTC Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.682480 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.682510 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.682526 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.682538 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.682547 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.784163 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.784198 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.784208 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.784220 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.784228 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.886050 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.886076 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.886085 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.886099 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.886111 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.987596 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.987624 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.987635 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.987645 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:22 crc kubenswrapper[4546]: I0201 06:44:22.987654 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:22Z","lastTransitionTime":"2026-02-01T06:44:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.088945 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.088979 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.088988 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.089006 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.089015 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:23Z","lastTransitionTime":"2026-02-01T06:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.191012 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.191066 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.191074 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.191087 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.191097 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:23Z","lastTransitionTime":"2026-02-01T06:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.292588 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.292611 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.292620 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.292633 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.292641 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:23Z","lastTransitionTime":"2026-02-01T06:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.394368 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.394398 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.394408 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.394418 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.394427 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:23Z","lastTransitionTime":"2026-02-01T06:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.496414 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.496446 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.496456 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.496471 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.496481 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:23Z","lastTransitionTime":"2026-02-01T06:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.598376 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.598408 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.598417 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.598430 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.598438 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:23Z","lastTransitionTime":"2026-02-01T06:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.654139 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.654178 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:23 crc kubenswrapper[4546]: E0201 06:44:23.654237 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.654146 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:23 crc kubenswrapper[4546]: E0201 06:44:23.654361 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:23 crc kubenswrapper[4546]: E0201 06:44:23.654461 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.670310 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:20:59.388593697 +0000 UTC Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.699711 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.699743 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.699755 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.699768 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.699778 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:23Z","lastTransitionTime":"2026-02-01T06:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.802141 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.802170 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.802178 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.802188 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.802196 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:23Z","lastTransitionTime":"2026-02-01T06:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.903690 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.903738 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.903748 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.903760 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:23 crc kubenswrapper[4546]: I0201 06:44:23.903771 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:23Z","lastTransitionTime":"2026-02-01T06:44:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.005493 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.005541 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.005551 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.005565 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.005576 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.106913 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.107058 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.107140 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.107210 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.107270 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.210435 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.210506 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.210531 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.210563 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.210585 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.313287 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.313338 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.313355 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.313374 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.313388 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.416023 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.416058 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.416067 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.416082 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.416095 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.518270 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.518304 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.518312 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.518326 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.518338 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.620185 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.620240 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.620251 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.620267 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.620279 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.654555 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:24 crc kubenswrapper[4546]: E0201 06:44:24.654988 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.670713 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 19:51:56.02671633 +0000 UTC Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.722688 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.722719 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.722730 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.722742 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.722752 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.824274 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.824308 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.824316 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.824333 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.824345 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.925760 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.925803 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.925812 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.925822 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:24 crc kubenswrapper[4546]: I0201 06:44:24.925830 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:24Z","lastTransitionTime":"2026-02-01T06:44:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.028126 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.028154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.028166 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.028182 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.028191 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.130105 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.130127 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.130134 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.130145 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.130152 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.231877 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.231926 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.231954 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.231973 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.231981 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.334013 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.334042 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.334072 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.334082 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.334090 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.435428 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.435476 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.435486 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.435499 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.435507 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.537083 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.537109 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.537117 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.537129 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.537140 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.638728 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.638784 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.638794 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.638808 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.638820 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.653998 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:25 crc kubenswrapper[4546]: E0201 06:44:25.654101 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.654234 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:25 crc kubenswrapper[4546]: E0201 06:44:25.654282 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.654359 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:25 crc kubenswrapper[4546]: E0201 06:44:25.654453 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.668552 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.671817 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 23:16:03.966566375 +0000 UTC Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.740512 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.740614 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.740675 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.740731 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.740788 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.842606 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.842648 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.842659 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.842673 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.842685 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.944637 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.944778 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.944881 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.944944 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:25 crc kubenswrapper[4546]: I0201 06:44:25.944994 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:25Z","lastTransitionTime":"2026-02-01T06:44:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.046711 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.046790 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.046800 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.046811 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.046820 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.148039 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.148199 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.148282 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.148361 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.148436 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.249956 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.249990 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.250003 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.250016 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.250023 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.352073 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.352103 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.352113 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.352125 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.352134 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.388701 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:26 crc kubenswrapper[4546]: E0201 06:44:26.388964 4546 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:44:26 crc kubenswrapper[4546]: E0201 06:44:26.389078 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs podName:1ca3c024-0f0b-4651-8eb7-9a7e0511739c nodeName:}" failed. No retries permitted until 2026-02-01 06:45:30.389046947 +0000 UTC m=+161.039982993 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs") pod "network-metrics-daemon-8tdck" (UID: "1ca3c024-0f0b-4651-8eb7-9a7e0511739c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.454306 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.454365 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.454377 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.454399 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.454413 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.556338 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.556436 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.556449 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.556471 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.556487 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.654473 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:26 crc kubenswrapper[4546]: E0201 06:44:26.654578 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.655085 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:44:26 crc kubenswrapper[4546]: E0201 06:44:26.655211 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4klz2_openshift-ovn-kubernetes(d4014c65-cdc3-4e2d-a7c3-2ac94248d488)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.657931 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.657960 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.657969 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.657981 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.657992 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.672466 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 15:11:15.607348774 +0000 UTC Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.760937 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.760960 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.760973 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.760986 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.760993 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.862627 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.862653 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.862663 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.862674 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.862681 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.964751 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.964778 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.964787 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.964801 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:26 crc kubenswrapper[4546]: I0201 06:44:26.964812 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:26Z","lastTransitionTime":"2026-02-01T06:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.066884 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.066913 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.066923 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.066935 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.066945 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.168572 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.168605 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.168614 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.168626 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.168651 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.270541 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.270569 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.270578 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.270591 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.270600 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.372703 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.372736 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.372744 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.372753 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.372760 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.474800 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.474831 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.474841 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.474852 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.474878 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.576489 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.576535 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.576546 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.576559 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.576569 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.654402 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.654570 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:27 crc kubenswrapper[4546]: E0201 06:44:27.654580 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.654653 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:27 crc kubenswrapper[4546]: E0201 06:44:27.654902 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:27 crc kubenswrapper[4546]: E0201 06:44:27.654943 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.673484 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:34:21.925876558 +0000 UTC Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.678673 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.678718 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.678728 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.678738 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.678747 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.780551 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.780650 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.780725 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.780785 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.780849 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.882897 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.883021 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.883107 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.883177 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.883250 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.984985 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.985026 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.985036 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.985055 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:27 crc kubenswrapper[4546]: I0201 06:44:27.985066 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:27Z","lastTransitionTime":"2026-02-01T06:44:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.087629 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.087661 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.087671 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.087685 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.087716 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:28Z","lastTransitionTime":"2026-02-01T06:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.189444 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.189490 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.189503 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.189530 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.189542 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:28Z","lastTransitionTime":"2026-02-01T06:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.291409 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.291442 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.291473 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.291485 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.291493 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:28Z","lastTransitionTime":"2026-02-01T06:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.393600 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.393645 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.393655 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.393669 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.393680 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:28Z","lastTransitionTime":"2026-02-01T06:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.495057 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.495100 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.495108 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.495127 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.495142 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:28Z","lastTransitionTime":"2026-02-01T06:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.597228 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.597289 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.597298 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.597315 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.597325 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:28Z","lastTransitionTime":"2026-02-01T06:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.653825 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:28 crc kubenswrapper[4546]: E0201 06:44:28.653990 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.674001 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 15:38:45.337491162 +0000 UTC Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.700005 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.700043 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.700055 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.700070 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.700081 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:28Z","lastTransitionTime":"2026-02-01T06:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.801548 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.801568 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.801576 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.801588 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.801595 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:28Z","lastTransitionTime":"2026-02-01T06:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.903806 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.903877 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.903899 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.903920 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:28 crc kubenswrapper[4546]: I0201 06:44:28.903939 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:28Z","lastTransitionTime":"2026-02-01T06:44:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.006296 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.006386 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.006397 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.006412 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.006422 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.107735 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.107769 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.107777 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.107789 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.107798 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.209940 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.210016 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.210027 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.210042 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.210053 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.311631 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.311774 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.311836 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.311926 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.311992 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.414324 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.414359 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.414368 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.414379 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.414387 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.515615 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.515652 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.515662 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.515675 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.515688 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.617154 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.617186 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.617195 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.617210 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.617223 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.654638 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.654664 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:29 crc kubenswrapper[4546]: E0201 06:44:29.654725 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:29 crc kubenswrapper[4546]: E0201 06:44:29.654947 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.654973 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:29 crc kubenswrapper[4546]: E0201 06:44:29.655031 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.674375 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 22:12:54.190476712 +0000 UTC Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.686778 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.686746672 podStartE2EDuration="53.686746672s" podCreationTimestamp="2026-02-01 06:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.675297593 +0000 UTC m=+100.326233609" watchObservedRunningTime="2026-02-01 06:44:29.686746672 +0000 UTC m=+100.337682689" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.708338 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.708322904 podStartE2EDuration="1m22.708322904s" podCreationTimestamp="2026-02-01 06:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.708214329 +0000 UTC m=+100.359150335" watchObservedRunningTime="2026-02-01 06:44:29.708322904 +0000 UTC m=+100.359258920" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.708503 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-z487m" podStartSLOduration=80.708498153 podStartE2EDuration="1m20.708498153s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 
06:44:29.695423242 +0000 UTC m=+100.346359247" watchObservedRunningTime="2026-02-01 06:44:29.708498153 +0000 UTC m=+100.359434170" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.718997 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.719196 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.719282 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.719372 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.719464 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.722162 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.722149951 podStartE2EDuration="1m22.722149951s" podCreationTimestamp="2026-02-01 06:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.71920193 +0000 UTC m=+100.370137936" watchObservedRunningTime="2026-02-01 06:44:29.722149951 +0000 UTC m=+100.373085967" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.762600 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podStartSLOduration=81.762590366 podStartE2EDuration="1m21.762590366s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.76255498 +0000 UTC m=+100.413490995" watchObservedRunningTime="2026-02-01 06:44:29.762590366 +0000 UTC m=+100.413526382" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.762847 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c4gpz" podStartSLOduration=81.762842901 podStartE2EDuration="1m21.762842901s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.750770207 +0000 UTC m=+100.401706223" watchObservedRunningTime="2026-02-01 06:44:29.762842901 +0000 UTC m=+100.413778918" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.774559 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nwmnb" podStartSLOduration=81.774542573 
podStartE2EDuration="1m21.774542573s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.774161346 +0000 UTC m=+100.425097352" watchObservedRunningTime="2026-02-01 06:44:29.774542573 +0000 UTC m=+100.425478589" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.800737 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=12.800727563 podStartE2EDuration="12.800727563s" podCreationTimestamp="2026-02-01 06:44:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.800504244 +0000 UTC m=+100.451440249" watchObservedRunningTime="2026-02-01 06:44:29.800727563 +0000 UTC m=+100.451663569" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.823077 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.823119 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.823142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.823165 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.823180 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.856260 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mj5bf" podStartSLOduration=81.856238507 podStartE2EDuration="1m21.856238507s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.855376444 +0000 UTC m=+100.506312461" watchObservedRunningTime="2026-02-01 06:44:29.856238507 +0000 UTC m=+100.507174514" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.863064 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fxcn7" podStartSLOduration=80.863045686 podStartE2EDuration="1m20.863045686s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.862546486 +0000 UTC m=+100.513482503" watchObservedRunningTime="2026-02-01 06:44:29.863045686 +0000 UTC m=+100.513981702" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.880241 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=4.880233514 podStartE2EDuration="4.880233514s" podCreationTimestamp="2026-02-01 06:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:29.879949859 +0000 UTC m=+100.530885875" watchObservedRunningTime="2026-02-01 06:44:29.880233514 +0000 UTC m=+100.531169520" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.926248 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.926330 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.926342 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.926358 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:29 crc kubenswrapper[4546]: I0201 06:44:29.926370 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:29Z","lastTransitionTime":"2026-02-01T06:44:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.027630 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.027657 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.027665 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.027674 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.027683 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.129157 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.129186 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.129196 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.129209 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.129218 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.231435 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.231480 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.231492 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.231509 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.231530 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.333819 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.333850 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.333875 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.333887 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.333896 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.435638 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.435667 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.435677 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.435686 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.435695 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.537478 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.537533 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.537545 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.537579 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.537593 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.639731 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.639784 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.639795 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.639805 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.639812 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.654285 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:30 crc kubenswrapper[4546]: E0201 06:44:30.654395 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.674499 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 17:02:24.593606596 +0000 UTC Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.742104 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.742146 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.742156 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.742174 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.742183 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.844089 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.844135 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.844148 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.844160 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.844169 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.945429 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.945456 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.945466 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.945493 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:30 crc kubenswrapper[4546]: I0201 06:44:30.945505 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:30Z","lastTransitionTime":"2026-02-01T06:44:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.048207 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.048257 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.048275 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.048284 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.048308 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.150100 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.150123 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.150132 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.150142 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.150149 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.251321 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.251346 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.251354 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.251363 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.251371 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.354115 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.354151 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.354159 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.354171 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.354180 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.455994 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.456044 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.456055 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.456078 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.456094 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.557698 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.557749 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.557765 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.557777 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.557786 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.654978 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.655026 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.655062 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:31 crc kubenswrapper[4546]: E0201 06:44:31.655117 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:31 crc kubenswrapper[4546]: E0201 06:44:31.655388 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:31 crc kubenswrapper[4546]: E0201 06:44:31.655585 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.659330 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.659365 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.659378 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.659391 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.659400 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.674668 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 10:31:38.245550432 +0000 UTC Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.761405 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.761462 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.761475 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.761491 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.761503 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.864100 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.864769 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.864850 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.864964 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.865028 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.967804 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.967842 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.967852 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.967885 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:31 crc kubenswrapper[4546]: I0201 06:44:31.967897 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:31Z","lastTransitionTime":"2026-02-01T06:44:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.069821 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.069974 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.070048 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.070139 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.070201 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:32Z","lastTransitionTime":"2026-02-01T06:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.172436 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.172490 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.172501 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.172520 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.172532 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:32Z","lastTransitionTime":"2026-02-01T06:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.274637 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.274663 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.274674 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.274687 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.274697 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:32Z","lastTransitionTime":"2026-02-01T06:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.376890 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.376927 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.376939 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.376953 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.376964 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:32Z","lastTransitionTime":"2026-02-01T06:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.479478 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.479563 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.479581 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.479611 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.479630 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:32Z","lastTransitionTime":"2026-02-01T06:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.582377 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.582418 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.582426 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.582444 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.582456 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:32Z","lastTransitionTime":"2026-02-01T06:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.654189 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:32 crc kubenswrapper[4546]: E0201 06:44:32.654296 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.655051 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.655101 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.655115 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.655126 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.655135 4546 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:44:32Z","lastTransitionTime":"2026-02-01T06:44:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.675431 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 01:21:26.355503105 +0000 UTC Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.675486 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.682747 4546 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.693521 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h"] Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.693911 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.695922 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.696075 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.696115 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.696237 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.843166 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/935db98e-644c-4533-aed3-75b4f979087b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.843212 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/935db98e-644c-4533-aed3-75b4f979087b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.843229 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/935db98e-644c-4533-aed3-75b4f979087b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.843273 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/935db98e-644c-4533-aed3-75b4f979087b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.843305 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/935db98e-644c-4533-aed3-75b4f979087b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.943790 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/935db98e-644c-4533-aed3-75b4f979087b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.943832 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/935db98e-644c-4533-aed3-75b4f979087b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.943875 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/935db98e-644c-4533-aed3-75b4f979087b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.943894 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/935db98e-644c-4533-aed3-75b4f979087b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.943925 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/935db98e-644c-4533-aed3-75b4f979087b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.943985 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/935db98e-644c-4533-aed3-75b4f979087b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.943996 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/935db98e-644c-4533-aed3-75b4f979087b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.944715 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/935db98e-644c-4533-aed3-75b4f979087b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: I0201 06:44:32.950156 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/935db98e-644c-4533-aed3-75b4f979087b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:32 crc kubenswrapper[4546]: 
I0201 06:44:32.956685 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/935db98e-644c-4533-aed3-75b4f979087b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x7q4h\" (UID: \"935db98e-644c-4533-aed3-75b4f979087b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:33 crc kubenswrapper[4546]: I0201 06:44:33.007659 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" Feb 01 06:44:33 crc kubenswrapper[4546]: I0201 06:44:33.134713 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" event={"ID":"935db98e-644c-4533-aed3-75b4f979087b","Type":"ContainerStarted","Data":"ed024d2b13fa82c4326bbdaff392f5adb74018da125eedb0896b2b3250073bde"} Feb 01 06:44:33 crc kubenswrapper[4546]: I0201 06:44:33.134786 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" event={"ID":"935db98e-644c-4533-aed3-75b4f979087b","Type":"ContainerStarted","Data":"4bd3404ab2e6cc4237b9a8e0c71012e2773626b19fb51d8d84375a712d69f51e"} Feb 01 06:44:33 crc kubenswrapper[4546]: I0201 06:44:33.148983 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x7q4h" podStartSLOduration=85.148968554 podStartE2EDuration="1m25.148968554s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:33.148074171 +0000 UTC m=+103.799010187" watchObservedRunningTime="2026-02-01 06:44:33.148968554 +0000 UTC m=+103.799904570" Feb 01 06:44:33 crc kubenswrapper[4546]: I0201 06:44:33.654453 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:33 crc kubenswrapper[4546]: I0201 06:44:33.654547 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:33 crc kubenswrapper[4546]: I0201 06:44:33.654568 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:33 crc kubenswrapper[4546]: E0201 06:44:33.654605 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:33 crc kubenswrapper[4546]: E0201 06:44:33.654729 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:33 crc kubenswrapper[4546]: E0201 06:44:33.654849 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:34 crc kubenswrapper[4546]: I0201 06:44:34.654064 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:34 crc kubenswrapper[4546]: E0201 06:44:34.654379 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:35 crc kubenswrapper[4546]: I0201 06:44:35.654017 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:35 crc kubenswrapper[4546]: E0201 06:44:35.654172 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:35 crc kubenswrapper[4546]: I0201 06:44:35.654191 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:35 crc kubenswrapper[4546]: I0201 06:44:35.654255 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:35 crc kubenswrapper[4546]: E0201 06:44:35.654366 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:35 crc kubenswrapper[4546]: E0201 06:44:35.654445 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:36 crc kubenswrapper[4546]: I0201 06:44:36.654756 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:36 crc kubenswrapper[4546]: E0201 06:44:36.654958 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:37 crc kubenswrapper[4546]: I0201 06:44:37.654531 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:37 crc kubenswrapper[4546]: I0201 06:44:37.654733 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:37 crc kubenswrapper[4546]: E0201 06:44:37.654773 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:37 crc kubenswrapper[4546]: I0201 06:44:37.654777 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:37 crc kubenswrapper[4546]: E0201 06:44:37.654981 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:37 crc kubenswrapper[4546]: E0201 06:44:37.655067 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:38 crc kubenswrapper[4546]: I0201 06:44:38.654032 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:38 crc kubenswrapper[4546]: E0201 06:44:38.654226 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:39 crc kubenswrapper[4546]: I0201 06:44:39.654095 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:39 crc kubenswrapper[4546]: I0201 06:44:39.654153 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:39 crc kubenswrapper[4546]: E0201 06:44:39.655037 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:39 crc kubenswrapper[4546]: I0201 06:44:39.655054 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:39 crc kubenswrapper[4546]: E0201 06:44:39.655350 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:39 crc kubenswrapper[4546]: E0201 06:44:39.655443 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:39 crc kubenswrapper[4546]: I0201 06:44:39.655731 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:44:40 crc kubenswrapper[4546]: I0201 06:44:40.154575 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/3.log" Feb 01 06:44:40 crc kubenswrapper[4546]: I0201 06:44:40.157686 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerStarted","Data":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} Feb 01 06:44:40 crc kubenswrapper[4546]: I0201 06:44:40.158110 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:44:40 crc 
kubenswrapper[4546]: I0201 06:44:40.183595 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podStartSLOduration=92.183567445 podStartE2EDuration="1m32.183567445s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:40.181248538 +0000 UTC m=+110.832184554" watchObservedRunningTime="2026-02-01 06:44:40.183567445 +0000 UTC m=+110.834503461" Feb 01 06:44:40 crc kubenswrapper[4546]: I0201 06:44:40.450471 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8tdck"] Feb 01 06:44:40 crc kubenswrapper[4546]: I0201 06:44:40.450638 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:40 crc kubenswrapper[4546]: E0201 06:44:40.450744 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:41 crc kubenswrapper[4546]: I0201 06:44:41.654549 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:41 crc kubenswrapper[4546]: I0201 06:44:41.654626 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:41 crc kubenswrapper[4546]: I0201 06:44:41.654648 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:41 crc kubenswrapper[4546]: I0201 06:44:41.654622 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:41 crc kubenswrapper[4546]: E0201 06:44:41.654755 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:44:41 crc kubenswrapper[4546]: E0201 06:44:41.654908 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:44:41 crc kubenswrapper[4546]: E0201 06:44:41.655112 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8tdck" podUID="1ca3c024-0f0b-4651-8eb7-9a7e0511739c" Feb 01 06:44:41 crc kubenswrapper[4546]: E0201 06:44:41.655159 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.909240 4546 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.909661 4546 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.942788 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gzcwd"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.943591 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.943735 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.944193 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.944632 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.945214 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.945230 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8659n"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.945626 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.948944 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9n5vv"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.949412 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.950330 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.950563 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.950650 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.950896 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.955084 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.955408 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.956226 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.958501 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.960453 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.961068 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.961157 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.961556 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.963962 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.964326 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.964448 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.964677 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.965047 4546 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.963976 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.965434 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.966941 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.969745 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.973367 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.986914 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.987107 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.987154 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.987319 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.987372 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.987321 
4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.987614 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.987624 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.987821 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wfvhf"] Feb 01 06:44:42 crc kubenswrapper[4546]: W0201 06:44:42.987905 4546 reflector.go:561] object-"openshift-image-registry"/"trusted-ca": failed to list *v1.ConfigMap: configmaps "trusted-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Feb 01 06:44:42 crc kubenswrapper[4546]: E0201 06:44:42.987974 4546 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"trusted-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.988071 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.988309 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-wfvhf" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.988079 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.988146 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.988825 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qxwqf"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.989418 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.990452 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.991126 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.991518 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.991766 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.993890 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.994094 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.994324 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.994483 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.995203 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.995983 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.996986 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9n59f"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.997346 4546 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc"] Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.997778 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.998628 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:42 crc kubenswrapper[4546]: I0201 06:44:42.999191 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:42.999980 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.000274 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.001009 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.001146 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.002411 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.003227 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.003434 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.003760 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.003920 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5jr6p"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.004492 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.006362 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.006917 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pk4cv"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.007252 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b4wcw"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.007694 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.007741 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.007945 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-mcws5"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.008343 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.008762 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.008906 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.014176 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.015362 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.015458 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.015686 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.016018 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.018578 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 01 06:44:43 
crc kubenswrapper[4546]: I0201 06:44:43.029777 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.033360 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.030630 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.030644 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.030710 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.030833 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.032176 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.032335 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.032497 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.032565 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.032641 4546 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.032681 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.032710 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.032823 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.032908 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.033481 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.038219 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.038260 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.038348 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.040127 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.040174 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 01 06:44:43 crc 
kubenswrapper[4546]: I0201 06:44:43.040208 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.040237 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.040286 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.040589 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.041965 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.042017 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.053595 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.054902 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.054987 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.055019 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hqcww"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 
06:44:43.055261 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.055317 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.055386 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.055772 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-85r82"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.056259 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.056582 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.057065 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.057403 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.057646 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.059730 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.060025 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.059461 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.063387 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.064942 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.065432 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.065457 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.066411 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.069880 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.070042 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.070219 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.070353 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.070599 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.070728 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.073197 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.073307 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.073985 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.074152 4546 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.075913 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.076700 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.083004 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.083279 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.085199 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.085610 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.089233 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5sd"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.090057 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.090375 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.090981 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.091464 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.091716 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.096036 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.096615 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hdscf"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.097005 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.097437 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.097467 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.097731 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.097834 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.097883 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.114526 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.115598 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.118056 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mqnml"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.119283 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.120982 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.121497 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs98t"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.121922 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.122224 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.122258 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.123931 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.125829 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.126452 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.128254 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.129379 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.129591 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.132263 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.133749 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137097 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gzcwd"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137565 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e6c27ea-97c9-4f56-ad23-91cda30acf6b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wnvc9\" (UID: \"7e6c27ea-97c9-4f56-ad23-91cda30acf6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137599 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3a65f9c1-682b-4818-a663-19b9c5281d78-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4zzt\" (UID: \"3a65f9c1-682b-4818-a663-19b9c5281d78\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137623 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjbc\" (UniqueName: \"kubernetes.io/projected/3a65f9c1-682b-4818-a663-19b9c5281d78-kube-api-access-prjbc\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4zzt\" (UID: \"3a65f9c1-682b-4818-a663-19b9c5281d78\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137640 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-oauth-config\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137659 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cdce34f-3d94-4efb-b9eb-627ce9da7031-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tnrrk\" (UID: \"7cdce34f-3d94-4efb-b9eb-627ce9da7031\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137676 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137692 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wppkl\" (UniqueName: \"kubernetes.io/projected/18dbc0ae-24aa-4377-90b5-52cff1a5e855-kube-api-access-wppkl\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137709 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gk7\" (UniqueName: \"kubernetes.io/projected/9760ca7f-b330-4ab0-ae37-57c150826f20-kube-api-access-28gk7\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137729 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-config\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137746 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/080cf935-686b-449c-8c11-6a3c19039b78-encryption-config\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137761 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-serving-cert\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137776 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/080cf935-686b-449c-8c11-6a3c19039b78-audit-dir\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137795 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3c232787-4f08-451b-ab33-d78c86f00dc7-stats-auth\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137812 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137826 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a65f9c1-682b-4818-a663-19b9c5281d78-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4zzt\" (UID: \"3a65f9c1-682b-4818-a663-19b9c5281d78\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137845 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75263970-db40-455a-8873-d1cea12d384b-auth-proxy-config\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137902 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7f8l\" (UniqueName: \"kubernetes.io/projected/75263970-db40-455a-8873-d1cea12d384b-kube-api-access-d7f8l\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137920 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkddl\" (UniqueName: \"kubernetes.io/projected/3c232787-4f08-451b-ab33-d78c86f00dc7-kube-api-access-tkddl\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137935 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07764668-24b4-4b55-ba97-eaf6d205d497-config\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137952 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-client-ca\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137968 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-policies\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.137990 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk27z\" (UniqueName: \"kubernetes.io/projected/07764668-24b4-4b55-ba97-eaf6d205d497-kube-api-access-vk27z\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138005 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-dir\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138020 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: 
\"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138039 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/080cf935-686b-449c-8c11-6a3c19039b78-etcd-client\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138054 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4251430-d927-4b5a-b0a2-a119c8109252-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138070 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cdce34f-3d94-4efb-b9eb-627ce9da7031-config\") pod \"kube-apiserver-operator-766d6c64bb-tnrrk\" (UID: \"7cdce34f-3d94-4efb-b9eb-627ce9da7031\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138090 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-oauth-serving-cert\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138106 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-config\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138123 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138137 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-config\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138153 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18dbc0ae-24aa-4377-90b5-52cff1a5e855-serving-cert\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138171 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75263970-db40-455a-8873-d1cea12d384b-config\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 
06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138187 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138196 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t2phc"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138202 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138946 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/080cf935-686b-449c-8c11-6a3c19039b78-node-pullsecrets\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138976 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-image-import-ca\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.138999 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8fcf426-a005-4459-a161-17905ef2f5ea-metrics-tls\") pod \"dns-operator-744455d44c-pk4cv\" (UID: \"e8fcf426-a005-4459-a161-17905ef2f5ea\") " pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.139018 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/18dbc0ae-24aa-4377-90b5-52cff1a5e855-etcd-client\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.139039 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-service-ca\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.139110 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.139342 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.140271 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.141523 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8659n"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.142269 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6df707cc-8a5d-437b-b822-4a7f2360c18d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vmpqc\" (UID: \"6df707cc-8a5d-437b-b822-4a7f2360c18d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.142326 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58mnx\" (UniqueName: \"kubernetes.io/projected/86b65b33-e838-40a0-84fa-e7c2a659cc1d-kube-api-access-58mnx\") pod \"downloads-7954f5f757-wfvhf\" (UID: \"86b65b33-e838-40a0-84fa-e7c2a659cc1d\") " pod="openshift-console/downloads-7954f5f757-wfvhf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.142351 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhflk\" (UniqueName: \"kubernetes.io/projected/21610d9b-73c6-4b4c-bc13-032e6f2b0f3b-kube-api-access-jhflk\") pod \"openshift-config-operator-7777fb866f-xfkxw\" (UID: \"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.143676 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144152 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qxwqf"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144703 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-trusted-ca-bundle\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144738 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzvz8\" (UniqueName: \"kubernetes.io/projected/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-kube-api-access-nzvz8\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144774 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-audit\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144794 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21610d9b-73c6-4b4c-bc13-032e6f2b0f3b-serving-cert\") pod \"openshift-config-operator-7777fb866f-xfkxw\" (UID: \"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144815 
4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144833 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4251430-d927-4b5a-b0a2-a119c8109252-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144876 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4251430-d927-4b5a-b0a2-a119c8109252-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144898 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-config\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144920 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4ph\" (UniqueName: 
\"kubernetes.io/projected/080cf935-686b-449c-8c11-6a3c19039b78-kube-api-access-qb4ph\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144939 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.144971 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-client-ca\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145031 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145061 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df707cc-8a5d-437b-b822-4a7f2360c18d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vmpqc\" (UID: \"6df707cc-8a5d-437b-b822-4a7f2360c18d\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145152 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-images\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145193 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07764668-24b4-4b55-ba97-eaf6d205d497-serving-cert\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145214 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3051be4-3bf1-4a18-8636-ed39c3a4c479-serving-cert\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145249 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3c232787-4f08-451b-ab33-d78c86f00dc7-default-certificate\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145288 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07764668-24b4-4b55-ba97-eaf6d205d497-trusted-ca\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145308 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145332 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145373 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-config\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145393 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsrd6\" (UniqueName: \"kubernetes.io/projected/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-kube-api-access-lsrd6\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: 
\"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145407 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tt45\" (UniqueName: \"kubernetes.io/projected/e8fcf426-a005-4459-a161-17905ef2f5ea-kube-api-access-2tt45\") pod \"dns-operator-744455d44c-pk4cv\" (UID: \"e8fcf426-a005-4459-a161-17905ef2f5ea\") " pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145433 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/18dbc0ae-24aa-4377-90b5-52cff1a5e855-etcd-service-ca\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145465 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ccmx\" (UniqueName: \"kubernetes.io/projected/7e6c27ea-97c9-4f56-ad23-91cda30acf6b-kube-api-access-9ccmx\") pod \"cluster-samples-operator-665b6dd947-wnvc9\" (UID: \"7e6c27ea-97c9-4f56-ad23-91cda30acf6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145485 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-etcd-serving-ca\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145502 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145526 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9760ca7f-b330-4ab0-ae37-57c150826f20-serving-cert\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145560 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8sr7\" (UniqueName: \"kubernetes.io/projected/b4251430-d927-4b5a-b0a2-a119c8109252-kube-api-access-m8sr7\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145580 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/18dbc0ae-24aa-4377-90b5-52cff1a5e855-etcd-ca\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145597 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df707cc-8a5d-437b-b822-4a7f2360c18d-config\") pod \"kube-controller-manager-operator-78b949d7b-vmpqc\" 
(UID: \"6df707cc-8a5d-437b-b822-4a7f2360c18d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145616 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64rjr\" (UniqueName: \"kubernetes.io/projected/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-kube-api-access-64rjr\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145633 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/080cf935-686b-449c-8c11-6a3c19039b78-serving-cert\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145657 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75263970-db40-455a-8873-d1cea12d384b-machine-approver-tls\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145675 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/21610d9b-73c6-4b4c-bc13-032e6f2b0f3b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xfkxw\" (UID: \"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145709 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cdce34f-3d94-4efb-b9eb-627ce9da7031-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tnrrk\" (UID: \"7cdce34f-3d94-4efb-b9eb-627ce9da7031\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145724 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c232787-4f08-451b-ab33-d78c86f00dc7-service-ca-bundle\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145739 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c232787-4f08-451b-ab33-d78c86f00dc7-metrics-certs\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145756 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145770 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18dbc0ae-24aa-4377-90b5-52cff1a5e855-config\") pod 
\"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145786 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkwx8\" (UniqueName: \"kubernetes.io/projected/b3051be4-3bf1-4a18-8636-ed39c3a4c479-kube-api-access-vkwx8\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145852 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.145998 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9n5vv"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.146965 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.147127 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.147952 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.148915 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7mzw5"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.149953 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.150678 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5jr6p"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.151650 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.152816 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.153420 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.154232 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9n59f"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.155367 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.156667 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-85r82"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.157632 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.158432 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hqcww"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.159292 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver/apiserver-76f77b778f-b4wcw"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.160443 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.160991 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pk4cv"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.163348 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.163958 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.165236 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.166886 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5sd"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.168081 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.169141 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.170493 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.170945 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hdscf"] Feb 01 06:44:43 crc 
kubenswrapper[4546]: I0201 06:44:43.171997 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.173330 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.175146 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wfvhf"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.175981 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t2phc"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.178070 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.180261 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.180998 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mqnml"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.181721 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9rzq5"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.182587 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mnvbs"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.182744 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.184104 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs98t"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.184135 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mnvbs" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.185650 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.196242 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mnvbs"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.197308 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9rzq5"] Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.205660 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.225442 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.245453 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.246655 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-csi-data-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.246694 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-oauth-config\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.246744 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-srv-cert\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.246767 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cdce34f-3d94-4efb-b9eb-627ce9da7031-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tnrrk\" (UID: \"7cdce34f-3d94-4efb-b9eb-627ce9da7031\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.246789 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.246850 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wppkl\" (UniqueName: \"kubernetes.io/projected/18dbc0ae-24aa-4377-90b5-52cff1a5e855-kube-api-access-wppkl\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.246995 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/080cf935-686b-449c-8c11-6a3c19039b78-encryption-config\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247017 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-config\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247063 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247082 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247103 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07764668-24b4-4b55-ba97-eaf6d205d497-config\") pod \"console-operator-58897d9998-9n5vv\" (UID: 
\"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247141 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj7dr\" (UniqueName: \"kubernetes.io/projected/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-kube-api-access-lj7dr\") pod \"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247165 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk27z\" (UniqueName: \"kubernetes.io/projected/07764668-24b4-4b55-ba97-eaf6d205d497-kube-api-access-vk27z\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247182 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-policies\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247201 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-oauth-serving-cert\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247223 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247240 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-mountpoint-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247261 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247277 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-config\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247295 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247312 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247329 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/080cf935-686b-449c-8c11-6a3c19039b78-node-pullsecrets\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247347 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-registration-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247366 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247383 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-service-ca\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 
06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247401 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhflk\" (UniqueName: \"kubernetes.io/projected/21610d9b-73c6-4b4c-bc13-032e6f2b0f3b-kube-api-access-jhflk\") pod \"openshift-config-operator-7777fb866f-xfkxw\" (UID: \"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247426 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-plugins-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247442 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-trusted-ca-bundle\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247461 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzvz8\" (UniqueName: \"kubernetes.io/projected/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-kube-api-access-nzvz8\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247486 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-audit\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " 
pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247514 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40441f6-397d-4546-b5ec-62c6e936be97-proxy-tls\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247532 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbwbg\" (UniqueName: \"kubernetes.io/projected/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-kube-api-access-nbwbg\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247552 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4251430-d927-4b5a-b0a2-a119c8109252-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247570 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-config\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247589 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b4251430-d927-4b5a-b0a2-a119c8109252-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247606 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4ph\" (UniqueName: \"kubernetes.io/projected/080cf935-686b-449c-8c11-6a3c19039b78-kube-api-access-qb4ph\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247622 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-tmpfs\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247636 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-srv-cert\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247652 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 
06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247685 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-images\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247703 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfwpd\" (UniqueName: \"kubernetes.io/projected/7956bd03-db5a-4524-85ac-184466ca0029-kube-api-access-dfwpd\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247720 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07764668-24b4-4b55-ba97-eaf6d205d497-trusted-ca\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247741 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhdcq\" (UniqueName: \"kubernetes.io/projected/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-kube-api-access-bhdcq\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247760 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-config\") pod \"machine-api-operator-5694c8668f-5jr6p\" 
(UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247784 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ccmx\" (UniqueName: \"kubernetes.io/projected/7e6c27ea-97c9-4f56-ad23-91cda30acf6b-kube-api-access-9ccmx\") pod \"cluster-samples-operator-665b6dd947-wnvc9\" (UID: \"7e6c27ea-97c9-4f56-ad23-91cda30acf6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247803 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6df707cc-8a5d-437b-b822-4a7f2360c18d-config\") pod \"kube-controller-manager-operator-78b949d7b-vmpqc\" (UID: \"6df707cc-8a5d-437b-b822-4a7f2360c18d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247821 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-audit-policies\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247839 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64rjr\" (UniqueName: \"kubernetes.io/projected/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-kube-api-access-64rjr\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247871 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/21610d9b-73c6-4b4c-bc13-032e6f2b0f3b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xfkxw\" (UID: \"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247895 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7956bd03-db5a-4524-85ac-184466ca0029-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247922 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c232787-4f08-451b-ab33-d78c86f00dc7-metrics-certs\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247941 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02ac825d-2a57-4917-9cd0-d8d058a8fb95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnks5\" (UID: \"02ac825d-2a57-4917-9cd0-d8d058a8fb95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247959 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bae272c1-b2d0-4b06-8d7a-aa580f4c40e1-proxy-tls\") pod \"machine-config-controller-84d6567774-th6nh\" (UID: 
\"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.247981 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c232787-4f08-451b-ab33-d78c86f00dc7-service-ca-bundle\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248001 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-encryption-config\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248028 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjbc\" (UniqueName: \"kubernetes.io/projected/3a65f9c1-682b-4818-a663-19b9c5281d78-kube-api-access-prjbc\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4zzt\" (UID: \"3a65f9c1-682b-4818-a663-19b9c5281d78\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248047 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e6c27ea-97c9-4f56-ad23-91cda30acf6b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wnvc9\" (UID: \"7e6c27ea-97c9-4f56-ad23-91cda30acf6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248064 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-webhook-cert\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248084 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24929dd-69fc-4c32-ae8a-65de2d609529-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q2rwz\" (UID: \"f24929dd-69fc-4c32-ae8a-65de2d609529\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248103 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a65f9c1-682b-4818-a663-19b9c5281d78-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4zzt\" (UID: \"3a65f9c1-682b-4818-a663-19b9c5281d78\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248121 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-socket-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248140 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gk7\" (UniqueName: \"kubernetes.io/projected/9760ca7f-b330-4ab0-ae37-57c150826f20-kube-api-access-28gk7\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: 
\"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248160 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-serving-cert\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248177 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/080cf935-686b-449c-8c11-6a3c19039b78-audit-dir\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248194 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3c232787-4f08-451b-ab33-d78c86f00dc7-stats-auth\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248212 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e5bbf75-17f7-4156-876c-8974e116f225-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljf2d\" (UID: \"1e5bbf75-17f7-4156-876c-8974e116f225\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248230 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkddl\" (UniqueName: 
\"kubernetes.io/projected/3c232787-4f08-451b-ab33-d78c86f00dc7-kube-api-access-tkddl\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248247 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a65f9c1-682b-4818-a663-19b9c5281d78-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4zzt\" (UID: \"3a65f9c1-682b-4818-a663-19b9c5281d78\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248264 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-client-ca\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248282 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75263970-db40-455a-8873-d1cea12d384b-auth-proxy-config\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248300 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7f8l\" (UniqueName: \"kubernetes.io/projected/75263970-db40-455a-8873-d1cea12d384b-kube-api-access-d7f8l\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:43 crc 
kubenswrapper[4546]: I0201 06:44:43.248318 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40441f6-397d-4546-b5ec-62c6e936be97-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248336 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-dir\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248355 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248376 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cdce34f-3d94-4efb-b9eb-627ce9da7031-config\") pod \"kube-apiserver-operator-766d6c64bb-tnrrk\" (UID: \"7cdce34f-3d94-4efb-b9eb-627ce9da7031\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248394 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248411 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248427 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/080cf935-686b-449c-8c11-6a3c19039b78-etcd-client\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248444 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4251430-d927-4b5a-b0a2-a119c8109252-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248462 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-config\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248479 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ltp8\" (UniqueName: \"kubernetes.io/projected/e40441f6-397d-4546-b5ec-62c6e936be97-kube-api-access-8ltp8\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248498 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18dbc0ae-24aa-4377-90b5-52cff1a5e855-serving-cert\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248520 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-apiservice-cert\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248536 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfv4\" (UniqueName: \"kubernetes.io/projected/02ac825d-2a57-4917-9cd0-d8d058a8fb95-kube-api-access-ljfv4\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnks5\" (UID: \"02ac825d-2a57-4917-9cd0-d8d058a8fb95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248554 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/bae272c1-b2d0-4b06-8d7a-aa580f4c40e1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-th6nh\" (UID: \"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248571 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75263970-db40-455a-8873-d1cea12d384b-config\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248587 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e5bbf75-17f7-4156-876c-8974e116f225-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljf2d\" (UID: \"1e5bbf75-17f7-4156-876c-8974e116f225\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248604 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-image-import-ca\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248621 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8fcf426-a005-4459-a161-17905ef2f5ea-metrics-tls\") pod \"dns-operator-744455d44c-pk4cv\" (UID: \"e8fcf426-a005-4459-a161-17905ef2f5ea\") " pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" Feb 01 06:44:43 crc kubenswrapper[4546]: 
I0201 06:44:43.248635 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/18dbc0ae-24aa-4377-90b5-52cff1a5e855-etcd-client\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248653 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6df707cc-8a5d-437b-b822-4a7f2360c18d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vmpqc\" (UID: \"6df707cc-8a5d-437b-b822-4a7f2360c18d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248670 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58mnx\" (UniqueName: \"kubernetes.io/projected/86b65b33-e838-40a0-84fa-e7c2a659cc1d-kube-api-access-58mnx\") pod \"downloads-7954f5f757-wfvhf\" (UID: \"86b65b33-e838-40a0-84fa-e7c2a659cc1d\") " pod="openshift-console/downloads-7954f5f757-wfvhf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248688 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmszm\" (UniqueName: \"kubernetes.io/projected/1e5bbf75-17f7-4156-876c-8974e116f225-kube-api-access-gmszm\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljf2d\" (UID: \"1e5bbf75-17f7-4156-876c-8974e116f225\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248705 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j8s6\" (UniqueName: 
\"kubernetes.io/projected/c853a7cc-059d-4757-951b-e094ae75d27f-kube-api-access-8j8s6\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248722 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e40441f6-397d-4546-b5ec-62c6e936be97-images\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248739 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-serving-cert\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248755 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21610d9b-73c6-4b4c-bc13-032e6f2b0f3b-serving-cert\") pod \"openshift-config-operator-7777fb866f-xfkxw\" (UID: \"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248773 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jllbk\" (UniqueName: \"kubernetes.io/projected/f24929dd-69fc-4c32-ae8a-65de2d609529-kube-api-access-jllbk\") pod \"openshift-apiserver-operator-796bbdcf4f-q2rwz\" (UID: \"f24929dd-69fc-4c32-ae8a-65de2d609529\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248800 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248819 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-client-ca\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248839 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248870 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df707cc-8a5d-437b-b822-4a7f2360c18d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vmpqc\" (UID: \"6df707cc-8a5d-437b-b822-4a7f2360c18d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248889 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-etcd-client\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248905 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-audit-dir\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248922 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nchv5\" (UniqueName: \"kubernetes.io/projected/7397ef95-4126-4f2e-9ba4-162440d6b87f-kube-api-access-nchv5\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248940 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07764668-24b4-4b55-ba97-eaf6d205d497-serving-cert\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.248969 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fcb2d7-d6c8-49b4-9574-cad807b4310b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fxg47\" (UID: \"a9fcb2d7-d6c8-49b4-9574-cad807b4310b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:43 crc 
kubenswrapper[4546]: I0201 06:44:43.248987 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-secret-volume\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249004 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl92b\" (UniqueName: \"kubernetes.io/projected/bae272c1-b2d0-4b06-8d7a-aa580f4c40e1-kube-api-access-wl92b\") pod \"machine-config-controller-84d6567774-th6nh\" (UID: \"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249023 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3c232787-4f08-451b-ab33-d78c86f00dc7-default-certificate\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249040 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249057 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3051be4-3bf1-4a18-8636-ed39c3a4c479-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249074 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsrd6\" (UniqueName: \"kubernetes.io/projected/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-kube-api-access-lsrd6\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249091 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tt45\" (UniqueName: \"kubernetes.io/projected/e8fcf426-a005-4459-a161-17905ef2f5ea-kube-api-access-2tt45\") pod \"dns-operator-744455d44c-pk4cv\" (UID: \"e8fcf426-a005-4459-a161-17905ef2f5ea\") " pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249108 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249126 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/18dbc0ae-24aa-4377-90b5-52cff1a5e855-etcd-service-ca\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249144 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c89l2\" (UniqueName: \"kubernetes.io/projected/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-kube-api-access-c89l2\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249162 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9760ca7f-b330-4ab0-ae37-57c150826f20-serving-cert\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249178 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f24929dd-69fc-4c32-ae8a-65de2d609529-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q2rwz\" (UID: \"f24929dd-69fc-4c32-ae8a-65de2d609529\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249196 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-etcd-serving-ca\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249212 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249231 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8sr7\" (UniqueName: \"kubernetes.io/projected/b4251430-d927-4b5a-b0a2-a119c8109252-kube-api-access-m8sr7\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249248 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/18dbc0ae-24aa-4377-90b5-52cff1a5e855-etcd-ca\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249265 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j92s8\" (UniqueName: \"kubernetes.io/projected/45f3b96f-5161-47ae-a33b-8a895303ae28-kube-api-access-j92s8\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249284 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7956bd03-db5a-4524-85ac-184466ca0029-trusted-ca\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249304 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7msq\" (UniqueName: 
\"kubernetes.io/projected/a9fcb2d7-d6c8-49b4-9574-cad807b4310b-kube-api-access-s7msq\") pod \"package-server-manager-789f6589d5-fxg47\" (UID: \"a9fcb2d7-d6c8-49b4-9574-cad807b4310b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249321 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-config-volume\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249338 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-profile-collector-cert\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249355 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/080cf935-686b-449c-8c11-6a3c19039b78-serving-cert\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249373 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75263970-db40-455a-8873-d1cea12d384b-machine-approver-tls\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 
06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249390 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249408 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cdce34f-3d94-4efb-b9eb-627ce9da7031-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tnrrk\" (UID: \"7cdce34f-3d94-4efb-b9eb-627ce9da7031\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249426 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249445 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18dbc0ae-24aa-4377-90b5-52cff1a5e855-config\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249462 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7956bd03-db5a-4524-85ac-184466ca0029-metrics-tls\") 
pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.249479 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkwx8\" (UniqueName: \"kubernetes.io/projected/b3051be4-3bf1-4a18-8636-ed39c3a4c479-kube-api-access-vkwx8\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.250251 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-policies\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.250632 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.251313 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-oauth-serving-cert\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.251825 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6df707cc-8a5d-437b-b822-4a7f2360c18d-config\") pod \"kube-controller-manager-operator-78b949d7b-vmpqc\" (UID: \"6df707cc-8a5d-437b-b822-4a7f2360c18d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.252126 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-config\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.252327 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/21610d9b-73c6-4b4c-bc13-032e6f2b0f3b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xfkxw\" (UID: \"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.252485 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75263970-db40-455a-8873-d1cea12d384b-config\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.254930 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cdce34f-3d94-4efb-b9eb-627ce9da7031-config\") pod \"kube-apiserver-operator-766d6c64bb-tnrrk\" (UID: \"7cdce34f-3d94-4efb-b9eb-627ce9da7031\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.255814 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-client-ca\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.255921 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.255945 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/080cf935-686b-449c-8c11-6a3c19039b78-audit-dir\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.256625 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-config\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.256974 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/080cf935-686b-449c-8c11-6a3c19039b78-node-pullsecrets\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.257753 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-image-import-ca\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.258367 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.258385 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a65f9c1-682b-4818-a663-19b9c5281d78-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4zzt\" (UID: \"3a65f9c1-682b-4818-a663-19b9c5281d78\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.258483 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.258496 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07764668-24b4-4b55-ba97-eaf6d205d497-config\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " 
pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.258901 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-config\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.259526 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-client-ca\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.259740 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6df707cc-8a5d-437b-b822-4a7f2360c18d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vmpqc\" (UID: \"6df707cc-8a5d-437b-b822-4a7f2360c18d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.260017 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-dir\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.260214 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e6c27ea-97c9-4f56-ad23-91cda30acf6b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wnvc9\" (UID: 
\"7e6c27ea-97c9-4f56-ad23-91cda30acf6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.260611 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/080cf935-686b-449c-8c11-6a3c19039b78-encryption-config\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.260674 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.261115 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-oauth-config\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.261414 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-audit\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.261545 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a65f9c1-682b-4818-a663-19b9c5281d78-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-d4zzt\" (UID: \"3a65f9c1-682b-4818-a663-19b9c5281d78\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.261661 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-trusted-ca-bundle\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.262192 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-service-ca\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.262572 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.262652 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.262725 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/75263970-db40-455a-8873-d1cea12d384b-auth-proxy-config\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.262945 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21610d9b-73c6-4b4c-bc13-032e6f2b0f3b-serving-cert\") pod \"openshift-config-operator-7777fb866f-xfkxw\" (UID: \"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.262986 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/080cf935-686b-449c-8c11-6a3c19039b78-etcd-client\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.263183 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/18dbc0ae-24aa-4377-90b5-52cff1a5e855-etcd-client\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.263457 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3c232787-4f08-451b-ab33-d78c86f00dc7-stats-auth\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.263771 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-config\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.264466 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-images\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.264683 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/18dbc0ae-24aa-4377-90b5-52cff1a5e855-etcd-service-ca\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.265344 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18dbc0ae-24aa-4377-90b5-52cff1a5e855-config\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.265459 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.265998 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/080cf935-686b-449c-8c11-6a3c19039b78-etcd-serving-ca\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.266035 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-config\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.266599 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c232787-4f08-451b-ab33-d78c86f00dc7-service-ca-bundle\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.266754 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/18dbc0ae-24aa-4377-90b5-52cff1a5e855-etcd-ca\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.267173 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07764668-24b4-4b55-ba97-eaf6d205d497-trusted-ca\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.267317 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.267332 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.267494 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18dbc0ae-24aa-4377-90b5-52cff1a5e855-serving-cert\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.268339 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/080cf935-686b-449c-8c11-6a3c19039b78-serving-cert\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.268516 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.268581 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-serving-cert\") pod \"console-f9d7485db-8659n\" (UID: 
\"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.270087 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cdce34f-3d94-4efb-b9eb-627ce9da7031-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tnrrk\" (UID: \"7cdce34f-3d94-4efb-b9eb-627ce9da7031\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.270240 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3051be4-3bf1-4a18-8636-ed39c3a4c479-serving-cert\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.270339 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c232787-4f08-451b-ab33-d78c86f00dc7-metrics-certs\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.270516 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.270695 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.270949 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75263970-db40-455a-8873-d1cea12d384b-machine-approver-tls\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.271302 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.271412 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8fcf426-a005-4459-a161-17905ef2f5ea-metrics-tls\") pod \"dns-operator-744455d44c-pk4cv\" (UID: \"e8fcf426-a005-4459-a161-17905ef2f5ea\") " pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.271603 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07764668-24b4-4b55-ba97-eaf6d205d497-serving-cert\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.271612 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3c232787-4f08-451b-ab33-d78c86f00dc7-default-certificate\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.271638 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4251430-d927-4b5a-b0a2-a119c8109252-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.273451 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.273689 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9760ca7f-b330-4ab0-ae37-57c150826f20-serving-cert\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.285979 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.306619 4546 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.325580 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.346077 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.350680 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-csi-data-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.350753 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-srv-cert\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.350784 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.350809 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj7dr\" (UniqueName: \"kubernetes.io/projected/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-kube-api-access-lj7dr\") pod 
\"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.350845 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-csi-data-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.350888 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.350919 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-mountpoint-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.350939 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-registration-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.350989 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-plugins-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351043 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40441f6-397d-4546-b5ec-62c6e936be97-proxy-tls\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351047 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-mountpoint-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351064 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbwbg\" (UniqueName: \"kubernetes.io/projected/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-kube-api-access-nbwbg\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351125 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-tmpfs\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351146 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-srv-cert\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351181 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfwpd\" (UniqueName: \"kubernetes.io/projected/7956bd03-db5a-4524-85ac-184466ca0029-kube-api-access-dfwpd\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351225 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhdcq\" (UniqueName: \"kubernetes.io/projected/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-kube-api-access-bhdcq\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351188 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-registration-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351238 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-plugins-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351264 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-audit-policies\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351318 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7956bd03-db5a-4524-85ac-184466ca0029-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351337 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02ac825d-2a57-4917-9cd0-d8d058a8fb95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnks5\" (UID: \"02ac825d-2a57-4917-9cd0-d8d058a8fb95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351379 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bae272c1-b2d0-4b06-8d7a-aa580f4c40e1-proxy-tls\") pod \"machine-config-controller-84d6567774-th6nh\" (UID: \"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351413 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-encryption-config\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351452 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24929dd-69fc-4c32-ae8a-65de2d609529-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q2rwz\" (UID: \"f24929dd-69fc-4c32-ae8a-65de2d609529\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351470 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-tmpfs\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351480 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-webhook-cert\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351528 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-socket-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351559 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e5bbf75-17f7-4156-876c-8974e116f225-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljf2d\" (UID: 
\"1e5bbf75-17f7-4156-876c-8974e116f225\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351586 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40441f6-397d-4546-b5ec-62c6e936be97-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351624 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351627 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7397ef95-4126-4f2e-9ba4-162440d6b87f-socket-dir\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351656 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351702 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ltp8\" 
(UniqueName: \"kubernetes.io/projected/e40441f6-397d-4546-b5ec-62c6e936be97-kube-api-access-8ltp8\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351729 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-apiservice-cert\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351765 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljfv4\" (UniqueName: \"kubernetes.io/projected/02ac825d-2a57-4917-9cd0-d8d058a8fb95-kube-api-access-ljfv4\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnks5\" (UID: \"02ac825d-2a57-4917-9cd0-d8d058a8fb95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351785 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bae272c1-b2d0-4b06-8d7a-aa580f4c40e1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-th6nh\" (UID: \"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351848 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e5bbf75-17f7-4156-876c-8974e116f225-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljf2d\" (UID: \"1e5bbf75-17f7-4156-876c-8974e116f225\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351910 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmszm\" (UniqueName: \"kubernetes.io/projected/1e5bbf75-17f7-4156-876c-8974e116f225-kube-api-access-gmszm\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljf2d\" (UID: \"1e5bbf75-17f7-4156-876c-8974e116f225\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351936 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j8s6\" (UniqueName: \"kubernetes.io/projected/c853a7cc-059d-4757-951b-e094ae75d27f-kube-api-access-8j8s6\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351973 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e40441f6-397d-4546-b5ec-62c6e936be97-images\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.351990 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-serving-cert\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352017 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-jllbk\" (UniqueName: \"kubernetes.io/projected/f24929dd-69fc-4c32-ae8a-65de2d609529-kube-api-access-jllbk\") pod \"openshift-apiserver-operator-796bbdcf4f-q2rwz\" (UID: \"f24929dd-69fc-4c32-ae8a-65de2d609529\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352056 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nchv5\" (UniqueName: \"kubernetes.io/projected/7397ef95-4126-4f2e-9ba4-162440d6b87f-kube-api-access-nchv5\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352086 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-etcd-client\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352120 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-audit-dir\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352139 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl92b\" (UniqueName: \"kubernetes.io/projected/bae272c1-b2d0-4b06-8d7a-aa580f4c40e1-kube-api-access-wl92b\") pod \"machine-config-controller-84d6567774-th6nh\" (UID: \"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:43 crc 
kubenswrapper[4546]: I0201 06:44:43.352160 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fcb2d7-d6c8-49b4-9574-cad807b4310b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fxg47\" (UID: \"a9fcb2d7-d6c8-49b4-9574-cad807b4310b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352172 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-audit-dir\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352181 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-secret-volume\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352218 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c89l2\" (UniqueName: \"kubernetes.io/projected/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-kube-api-access-c89l2\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352246 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e40441f6-397d-4546-b5ec-62c6e936be97-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: 
\"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352258 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f24929dd-69fc-4c32-ae8a-65de2d609529-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q2rwz\" (UID: \"f24929dd-69fc-4c32-ae8a-65de2d609529\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352309 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j92s8\" (UniqueName: \"kubernetes.io/projected/45f3b96f-5161-47ae-a33b-8a895303ae28-kube-api-access-j92s8\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352329 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7956bd03-db5a-4524-85ac-184466ca0029-trusted-ca\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352365 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7msq\" (UniqueName: \"kubernetes.io/projected/a9fcb2d7-d6c8-49b4-9574-cad807b4310b-kube-api-access-s7msq\") pod \"package-server-manager-789f6589d5-fxg47\" (UID: \"a9fcb2d7-d6c8-49b4-9574-cad807b4310b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352382 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-config-volume\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352401 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-profile-collector-cert\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352463 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352491 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7956bd03-db5a-4524-85ac-184466ca0029-metrics-tls\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.352652 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bae272c1-b2d0-4b06-8d7a-aa580f4c40e1-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-th6nh\" (UID: \"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.354404 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f24929dd-69fc-4c32-ae8a-65de2d609529-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q2rwz\" (UID: \"f24929dd-69fc-4c32-ae8a-65de2d609529\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.365330 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.372225 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24929dd-69fc-4c32-ae8a-65de2d609529-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q2rwz\" (UID: \"f24929dd-69fc-4c32-ae8a-65de2d609529\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.385137 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.405610 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.414069 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-encryption-config\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.426489 4546 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.466404 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.475245 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-etcd-client\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.486290 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.494529 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-serving-cert\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.505215 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.526239 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.545708 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.552378 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.565967 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.572440 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-audit-policies\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.585059 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.592783 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.605247 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.639891 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.645288 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.654783 4546 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.654843 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.655091 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.655184 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.665736 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.686581 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.705826 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.724838 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.745777 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.765903 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.785914 4546 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.805990 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.815034 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e5bbf75-17f7-4156-876c-8974e116f225-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljf2d\" (UID: \"1e5bbf75-17f7-4156-876c-8974e116f225\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.825416 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.833476 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e5bbf75-17f7-4156-876c-8974e116f225-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljf2d\" (UID: \"1e5bbf75-17f7-4156-876c-8974e116f225\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.846000 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.872090 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.873836 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/7956bd03-db5a-4524-85ac-184466ca0029-trusted-ca\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.885583 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.905469 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.916009 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7956bd03-db5a-4524-85ac-184466ca0029-metrics-tls\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.925065 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.945937 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.966288 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 01 06:44:43 crc kubenswrapper[4546]: I0201 06:44:43.985679 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.005298 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.025552 4546 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.045407 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.054999 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-webhook-cert\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.055055 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-apiservice-cert\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.065247 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.085484 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.104662 4546 request.go:700] Waited for 1.012760592s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.105768 4546 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.125945 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.145135 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.165685 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.172476 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e40441f6-397d-4546-b5ec-62c6e936be97-images\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.185994 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.205030 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.225182 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.245950 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.255007 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e40441f6-397d-4546-b5ec-62c6e936be97-proxy-tls\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.262130 4546 configmap.go:193] Couldn't get configMap openshift-image-registry/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.262203 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b4251430-d927-4b5a-b0a2-a119c8109252-trusted-ca podName:b4251430-d927-4b5a-b0a2-a119c8109252 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.762181465 +0000 UTC m=+115.413117481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/b4251430-d927-4b5a-b0a2-a119c8109252-trusted-ca") pod "cluster-image-registry-operator-dc59b4c8b-mznt2" (UID: "b4251430-d927-4b5a-b0a2-a119c8109252") : failed to sync configmap cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.267243 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.274396 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bae272c1-b2d0-4b06-8d7a-aa580f4c40e1-proxy-tls\") pod \"machine-config-controller-84d6567774-th6nh\" (UID: \"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.285789 4546 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.305555 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.326547 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.345926 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.351553 4546 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.351703 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02ac825d-2a57-4917-9cd0-d8d058a8fb95-control-plane-machine-set-operator-tls podName:02ac825d-2a57-4917-9cd0-d8d058a8fb95 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.851685271 +0000 UTC m=+115.502621287 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/02ac825d-2a57-4917-9cd0-d8d058a8fb95-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-jnks5" (UID: "02ac825d-2a57-4917-9cd0-d8d058a8fb95") : failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.351785 4546 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.351837 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-operator-metrics podName:ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.851826549 +0000 UTC m=+115.502762564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-operator-metrics") pod "marketplace-operator-79b997595-bs98t" (UID: "ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697") : failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.351585 4546 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352029 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-srv-cert podName:45f3b96f-5161-47ae-a33b-8a895303ae28 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.852015033 +0000 UTC m=+115.502951049 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-srv-cert") pod "olm-operator-6b444d44fb-n2974" (UID: "45f3b96f-5161-47ae-a33b-8a895303ae28") : failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.351610 4546 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352184 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-profile-collector-cert podName:45f3b96f-5161-47ae-a33b-8a895303ae28 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.852174793 +0000 UTC m=+115.503110810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-profile-collector-cert") pod "olm-operator-6b444d44fb-n2974" (UID: "45f3b96f-5161-47ae-a33b-8a895303ae28") : failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.351633 4546 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352287 4546 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352261 4546 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352394 4546 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-srv-cert podName:c853a7cc-059d-4757-951b-e094ae75d27f nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.852383888 +0000 UTC m=+115.503319904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-srv-cert") pod "catalog-operator-68c6474976-f4tq9" (UID: "c853a7cc-059d-4757-951b-e094ae75d27f") : failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352473 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-secret-volume podName:0904ae3e-72bf-4b72-9c6b-734d840b9cf5 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.85246521 +0000 UTC m=+115.503401226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-secret-volume") pod "collect-profiles-29498790-bxbrm" (UID: "0904ae3e-72bf-4b72-9c6b-734d840b9cf5") : failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352618 4546 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352695 4546 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352699 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9fcb2d7-d6c8-49b4-9574-cad807b4310b-package-server-manager-serving-cert podName:a9fcb2d7-d6c8-49b4-9574-cad807b4310b nodeName:}" failed. 
No retries permitted until 2026-02-01 06:44:44.852632947 +0000 UTC m=+115.503568963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a9fcb2d7-d6c8-49b4-9574-cad807b4310b-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-fxg47" (UID: "a9fcb2d7-d6c8-49b4-9574-cad807b4310b") : failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352706 4546 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352939 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-config-volume podName:0904ae3e-72bf-4b72-9c6b-734d840b9cf5 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.852835437 +0000 UTC m=+115.503771454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-config-volume") pod "collect-profiles-29498790-bxbrm" (UID: "0904ae3e-72bf-4b72-9c6b-734d840b9cf5") : failed to sync configmap cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.352997 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-profile-collector-cert podName:c853a7cc-059d-4757-951b-e094ae75d27f nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.852984699 +0000 UTC m=+115.503920715 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-profile-collector-cert") pod "catalog-operator-68c6474976-f4tq9" (UID: "c853a7cc-059d-4757-951b-e094ae75d27f") : failed to sync secret cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: E0201 06:44:44.353022 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-trusted-ca podName:ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:44.853013203 +0000 UTC m=+115.503949219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-trusted-ca") pod "marketplace-operator-79b997595-bs98t" (UID: "ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697") : failed to sync configmap cache: timed out waiting for the condition Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.365173 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.385691 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.405320 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.426079 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.445269 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 
01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.465980 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.485417 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.506248 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.526197 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.545977 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.565503 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.585674 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.605228 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.631738 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.645710 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.665482 4546 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.685981 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.705962 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.725547 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.745150 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.765631 4546 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.773531 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4251430-d927-4b5a-b0a2-a119c8109252-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.786046 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.805885 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.826083 4546 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.846094 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.874377 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02ac825d-2a57-4917-9cd0-d8d058a8fb95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnks5\" (UID: \"02ac825d-2a57-4917-9cd0-d8d058a8fb95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.874475 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.874611 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fcb2d7-d6c8-49b4-9574-cad807b4310b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fxg47\" (UID: \"a9fcb2d7-d6c8-49b4-9574-cad807b4310b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.874647 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-secret-volume\") pod \"collect-profiles-29498790-bxbrm\" (UID: 
\"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.874730 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-config-volume\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.874754 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-profile-collector-cert\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.875397 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.875470 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-srv-cert\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.875543 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.875695 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-srv-cert\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.875695 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-config-volume\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.877231 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.879295 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/02ac825d-2a57-4917-9cd0-d8d058a8fb95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnks5\" (UID: \"02ac825d-2a57-4917-9cd0-d8d058a8fb95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" Feb 01 06:44:44 crc 
kubenswrapper[4546]: I0201 06:44:44.879971 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.879994 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-secret-volume\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.880021 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45f3b96f-5161-47ae-a33b-8a895303ae28-srv-cert\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.879996 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fcb2d7-d6c8-49b4-9574-cad807b4310b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fxg47\" (UID: \"a9fcb2d7-d6c8-49b4-9574-cad807b4310b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.880572 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bs98t\" 
(UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.880636 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-profile-collector-cert\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.880742 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c853a7cc-059d-4757-951b-e094ae75d27f-srv-cert\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.886313 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.905096 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.925772 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.945097 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.965946 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 01 06:44:44 crc kubenswrapper[4546]: I0201 06:44:44.985436 4546 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.005143 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.037933 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkwx8\" (UniqueName: \"kubernetes.io/projected/b3051be4-3bf1-4a18-8636-ed39c3a4c479-kube-api-access-vkwx8\") pod \"route-controller-manager-6576b87f9c-ntrd2\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.057776 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk27z\" (UniqueName: \"kubernetes.io/projected/07764668-24b4-4b55-ba97-eaf6d205d497-kube-api-access-vk27z\") pod \"console-operator-58897d9998-9n5vv\" (UID: \"07764668-24b4-4b55-ba97-eaf6d205d497\") " pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.077662 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cdce34f-3d94-4efb-b9eb-627ce9da7031-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tnrrk\" (UID: \"7cdce34f-3d94-4efb-b9eb-627ce9da7031\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.096820 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wppkl\" (UniqueName: \"kubernetes.io/projected/18dbc0ae-24aa-4377-90b5-52cff1a5e855-kube-api-access-wppkl\") pod \"etcd-operator-b45778765-qxwqf\" (UID: \"18dbc0ae-24aa-4377-90b5-52cff1a5e855\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 
06:44:45.101743 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.118495 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ccmx\" (UniqueName: \"kubernetes.io/projected/7e6c27ea-97c9-4f56-ad23-91cda30acf6b-kube-api-access-9ccmx\") pod \"cluster-samples-operator-665b6dd947-wnvc9\" (UID: \"7e6c27ea-97c9-4f56-ad23-91cda30acf6b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.124336 4546 request.go:700] Waited for 1.872311991s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/serviceaccounts/oauth-openshift/token Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.127552 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.139547 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64rjr\" (UniqueName: \"kubernetes.io/projected/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-kube-api-access-64rjr\") pod \"oauth-openshift-558db77b4-9n59f\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.157566 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.159281 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58mnx\" (UniqueName: \"kubernetes.io/projected/86b65b33-e838-40a0-84fa-e7c2a659cc1d-kube-api-access-58mnx\") pod \"downloads-7954f5f757-wfvhf\" (UID: \"86b65b33-e838-40a0-84fa-e7c2a659cc1d\") " pod="openshift-console/downloads-7954f5f757-wfvhf" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.161330 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wfvhf" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.182148 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjbc\" (UniqueName: \"kubernetes.io/projected/3a65f9c1-682b-4818-a663-19b9c5281d78-kube-api-access-prjbc\") pod \"openshift-controller-manager-operator-756b6f6bc6-d4zzt\" (UID: \"3a65f9c1-682b-4818-a663-19b9c5281d78\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.186095 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.197209 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.203722 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4251430-d927-4b5a-b0a2-a119c8109252-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.222961 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6df707cc-8a5d-437b-b822-4a7f2360c18d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vmpqc\" (UID: \"6df707cc-8a5d-437b-b822-4a7f2360c18d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.227398 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.239260 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.246213 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gk7\" (UniqueName: \"kubernetes.io/projected/9760ca7f-b330-4ab0-ae37-57c150826f20-kube-api-access-28gk7\") pod \"controller-manager-879f6c89f-gzcwd\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.259873 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkddl\" (UniqueName: \"kubernetes.io/projected/3c232787-4f08-451b-ab33-d78c86f00dc7-kube-api-access-tkddl\") pod \"router-default-5444994796-mcws5\" (UID: \"3c232787-4f08-451b-ab33-d78c86f00dc7\") " pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.276901 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.285744 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhflk\" (UniqueName: \"kubernetes.io/projected/21610d9b-73c6-4b4c-bc13-032e6f2b0f3b-kube-api-access-jhflk\") pod \"openshift-config-operator-7777fb866f-xfkxw\" (UID: \"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.299590 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7f8l\" (UniqueName: \"kubernetes.io/projected/75263970-db40-455a-8873-d1cea12d384b-kube-api-access-d7f8l\") pod \"machine-approver-56656f9798-kv9mn\" (UID: \"75263970-db40-455a-8873-d1cea12d384b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.329470 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsrd6\" (UniqueName: \"kubernetes.io/projected/cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7-kube-api-access-lsrd6\") pod \"machine-api-operator-5694c8668f-5jr6p\" (UID: \"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.337799 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.343842 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tt45\" (UniqueName: \"kubernetes.io/projected/e8fcf426-a005-4459-a161-17905ef2f5ea-kube-api-access-2tt45\") pod \"dns-operator-744455d44c-pk4cv\" (UID: \"e8fcf426-a005-4459-a161-17905ef2f5ea\") " pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" Feb 01 
06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.362599 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzvz8\" (UniqueName: \"kubernetes.io/projected/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-kube-api-access-nzvz8\") pod \"console-f9d7485db-8659n\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.370179 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.383658 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4ph\" (UniqueName: \"kubernetes.io/projected/080cf935-686b-449c-8c11-6a3c19039b78-kube-api-access-qb4ph\") pod \"apiserver-76f77b778f-b4wcw\" (UID: \"080cf935-686b-449c-8c11-6a3c19039b78\") " pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.396497 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9n5vv"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.396718 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.407614 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.412515 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8sr7\" (UniqueName: \"kubernetes.io/projected/b4251430-d927-4b5a-b0a2-a119c8109252-kube-api-access-m8sr7\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.421834 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj7dr\" (UniqueName: \"kubernetes.io/projected/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-kube-api-access-lj7dr\") pod \"marketplace-operator-79b997595-bs98t\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.452004 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbwbg\" (UniqueName: \"kubernetes.io/projected/7b27933c-64bc-4259-9eb3-62faa9ae7fbb-kube-api-access-nbwbg\") pod \"apiserver-7bbb656c7d-dlnvv\" (UID: \"7b27933c-64bc-4259-9eb3-62faa9ae7fbb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.468352 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfwpd\" (UniqueName: \"kubernetes.io/projected/7956bd03-db5a-4524-85ac-184466ca0029-kube-api-access-dfwpd\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.489131 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhdcq\" (UniqueName: 
\"kubernetes.io/projected/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-kube-api-access-bhdcq\") pod \"collect-profiles-29498790-bxbrm\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.508416 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7956bd03-db5a-4524-85ac-184466ca0029-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dcj8j\" (UID: \"7956bd03-db5a-4524-85ac-184466ca0029\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.519139 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.525723 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ltp8\" (UniqueName: \"kubernetes.io/projected/e40441f6-397d-4546-b5ec-62c6e936be97-kube-api-access-8ltp8\") pod \"machine-config-operator-74547568cd-ztkkq\" (UID: \"e40441f6-397d-4546-b5ec-62c6e936be97\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.538808 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfv4\" (UniqueName: \"kubernetes.io/projected/02ac825d-2a57-4917-9cd0-d8d058a8fb95-kube-api-access-ljfv4\") pod \"control-plane-machine-set-operator-78cbb6b69f-jnks5\" (UID: \"02ac825d-2a57-4917-9cd0-d8d058a8fb95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.547586 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.556385 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.561301 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.563918 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmszm\" (UniqueName: \"kubernetes.io/projected/1e5bbf75-17f7-4156-876c-8974e116f225-kube-api-access-gmszm\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljf2d\" (UID: \"1e5bbf75-17f7-4156-876c-8974e116f225\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.572328 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.579092 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.582045 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j8s6\" (UniqueName: \"kubernetes.io/projected/c853a7cc-059d-4757-951b-e094ae75d27f-kube-api-access-8j8s6\") pod \"catalog-operator-68c6474976-f4tq9\" (UID: \"c853a7cc-059d-4757-951b-e094ae75d27f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.589167 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.601912 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jllbk\" (UniqueName: \"kubernetes.io/projected/f24929dd-69fc-4c32-ae8a-65de2d609529-kube-api-access-jllbk\") pod \"openshift-apiserver-operator-796bbdcf4f-q2rwz\" (UID: \"f24929dd-69fc-4c32-ae8a-65de2d609529\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.608501 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.617562 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gzcwd"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.621127 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.633411 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nchv5\" (UniqueName: \"kubernetes.io/projected/7397ef95-4126-4f2e-9ba4-162440d6b87f-kube-api-access-nchv5\") pod \"csi-hostpathplugin-t2phc\" (UID: \"7397ef95-4126-4f2e-9ba4-162440d6b87f\") " pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.638742 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.644180 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl92b\" (UniqueName: \"kubernetes.io/projected/bae272c1-b2d0-4b06-8d7a-aa580f4c40e1-kube-api-access-wl92b\") pod \"machine-config-controller-84d6567774-th6nh\" (UID: \"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.644665 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.652205 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.683513 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c89l2\" (UniqueName: \"kubernetes.io/projected/7c73d592-2bf8-4b99-abdc-2fdeea5f2245-kube-api-access-c89l2\") pod \"packageserver-d55dfcdfc-vjdk4\" (UID: \"7c73d592-2bf8-4b99-abdc-2fdeea5f2245\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.687802 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j92s8\" (UniqueName: \"kubernetes.io/projected/45f3b96f-5161-47ae-a33b-8a895303ae28-kube-api-access-j92s8\") pod \"olm-operator-6b444d44fb-n2974\" (UID: \"45f3b96f-5161-47ae-a33b-8a895303ae28\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.700896 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wfvhf"] Feb 
01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.702631 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7msq\" (UniqueName: \"kubernetes.io/projected/a9fcb2d7-d6c8-49b4-9574-cad807b4310b-kube-api-access-s7msq\") pod \"package-server-manager-789f6589d5-fxg47\" (UID: \"a9fcb2d7-d6c8-49b4-9574-cad807b4310b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.702971 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.704714 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8659n"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.714532 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qxwqf"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.714984 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.719809 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.723830 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.726631 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.728410 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.733407 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.740782 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9n59f"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.743358 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.747487 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.765784 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-t2phc" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.766089 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 01 06:44:45 crc kubenswrapper[4546]: E0201 06:44:45.775716 4546 configmap.go:193] Couldn't get configMap openshift-image-registry/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Feb 01 06:44:45 crc kubenswrapper[4546]: E0201 06:44:45.775768 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b4251430-d927-4b5a-b0a2-a119c8109252-trusted-ca podName:b4251430-d927-4b5a-b0a2-a119c8109252 nodeName:}" failed. No retries permitted until 2026-02-01 06:44:46.775753 +0000 UTC m=+117.426689005 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/b4251430-d927-4b5a-b0a2-a119c8109252-trusted-ca") pod "cluster-image-registry-operator-dc59b4c8b-mznt2" (UID: "b4251430-d927-4b5a-b0a2-a119c8109252") : failed to sync configmap cache: timed out waiting for the condition Feb 01 06:44:45 crc kubenswrapper[4546]: W0201 06:44:45.784087 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cdce34f_3d94_4efb_b9eb_627ce9da7031.slice/crio-8a609aa523e30e6dda3f5acdc4d3632aae17f671412ed40ac05a17213534f704 WatchSource:0}: Error finding container 8a609aa523e30e6dda3f5acdc4d3632aae17f671412ed40ac05a17213534f704: Status 404 returned error can't find the container with id 8a609aa523e30e6dda3f5acdc4d3632aae17f671412ed40ac05a17213534f704 Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.787255 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.806095 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.825914 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.836240 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc"] Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.853445 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892580 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5e7c6c80-51df-45e3-86c7-4519f0a63582-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cfc5j\" (UID: \"5e7c6c80-51df-45e3-86c7-4519f0a63582\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892626 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/68fa9ddb-0c76-4535-964b-5cfe6af0333e-certs\") pod \"machine-config-server-7mzw5\" (UID: \"68fa9ddb-0c76-4535-964b-5cfe6af0333e\") " pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892694 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72d57c3b-9eb0-4f6d-b538-80a5072b170d-config\") pod \"service-ca-operator-777779d784-hqcww\" (UID: \"72d57c3b-9eb0-4f6d-b538-80a5072b170d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892745 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-serving-cert\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892762 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-tls\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 
06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892777 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-trusted-ca\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892836 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e7c6c80-51df-45e3-86c7-4519f0a63582-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cfc5j\" (UID: \"5e7c6c80-51df-45e3-86c7-4519f0a63582\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892851 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmkb\" (UniqueName: \"kubernetes.io/projected/68fa9ddb-0c76-4535-964b-5cfe6af0333e-kube-api-access-qkmkb\") pod \"machine-config-server-7mzw5\" (UID: \"68fa9ddb-0c76-4535-964b-5cfe6af0333e\") " pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892944 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e473656b-7b9e-4c95-90e0-c67f074cafdc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hdscf\" (UID: \"e473656b-7b9e-4c95-90e0-c67f074cafdc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.892973 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6hk5\" (UniqueName: 
\"kubernetes.io/projected/72d57c3b-9eb0-4f6d-b538-80a5072b170d-kube-api-access-r6hk5\") pod \"service-ca-operator-777779d784-hqcww\" (UID: \"72d57c3b-9eb0-4f6d-b538-80a5072b170d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893031 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxc5\" (UniqueName: \"kubernetes.io/projected/e473656b-7b9e-4c95-90e0-c67f074cafdc-kube-api-access-kgxc5\") pod \"multus-admission-controller-857f4d67dd-hdscf\" (UID: \"e473656b-7b9e-4c95-90e0-c67f074cafdc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893049 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58rd\" (UniqueName: \"kubernetes.io/projected/f0d698fd-3f42-4997-9c85-6bdb897795dd-kube-api-access-x58rd\") pod \"migrator-59844c95c7-6wz2l\" (UID: \"f0d698fd-3f42-4997-9c85-6bdb897795dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893089 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-config\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893113 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/813828d1-6b58-42d0-a3e6-b5b0c67423c7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893127 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/38c5642d-e433-49dc-9143-c7dd72739498-signing-key\") pod \"service-ca-9c57cc56f-mqnml\" (UID: \"38c5642d-e433-49dc-9143-c7dd72739498\") " pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893166 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhvmj\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-kube-api-access-jhvmj\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893199 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48vtt\" (UniqueName: \"kubernetes.io/projected/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-kube-api-access-48vtt\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893231 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e7c6c80-51df-45e3-86c7-4519f0a63582-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cfc5j\" (UID: \"5e7c6c80-51df-45e3-86c7-4519f0a63582\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893267 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-service-ca-bundle\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893330 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-bound-sa-token\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893348 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fssq9\" (UniqueName: \"kubernetes.io/projected/38c5642d-e433-49dc-9143-c7dd72739498-kube-api-access-fssq9\") pod \"service-ca-9c57cc56f-mqnml\" (UID: \"38c5642d-e433-49dc-9143-c7dd72739498\") " pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.893814 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/813828d1-6b58-42d0-a3e6-b5b0c67423c7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.894359 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.894389 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/38c5642d-e433-49dc-9143-c7dd72739498-signing-cabundle\") pod \"service-ca-9c57cc56f-mqnml\" (UID: \"38c5642d-e433-49dc-9143-c7dd72739498\") " pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.894478 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72d57c3b-9eb0-4f6d-b538-80a5072b170d-serving-cert\") pod \"service-ca-operator-777779d784-hqcww\" (UID: \"72d57c3b-9eb0-4f6d-b538-80a5072b170d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.894514 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-certificates\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.894530 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/68fa9ddb-0c76-4535-964b-5cfe6af0333e-node-bootstrap-token\") pod \"machine-config-server-7mzw5\" (UID: \"68fa9ddb-0c76-4535-964b-5cfe6af0333e\") " pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.894555 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: E0201 06:44:45.894871 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:46.39484176 +0000 UTC m=+117.045777776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.897305 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.931708 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995036 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995479 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6hk5\" (UniqueName: \"kubernetes.io/projected/72d57c3b-9eb0-4f6d-b538-80a5072b170d-kube-api-access-r6hk5\") pod \"service-ca-operator-777779d784-hqcww\" (UID: \"72d57c3b-9eb0-4f6d-b538-80a5072b170d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995550 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxc5\" (UniqueName: \"kubernetes.io/projected/e473656b-7b9e-4c95-90e0-c67f074cafdc-kube-api-access-kgxc5\") pod \"multus-admission-controller-857f4d67dd-hdscf\" (UID: \"e473656b-7b9e-4c95-90e0-c67f074cafdc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995578 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x58rd\" (UniqueName: \"kubernetes.io/projected/f0d698fd-3f42-4997-9c85-6bdb897795dd-kube-api-access-x58rd\") pod \"migrator-59844c95c7-6wz2l\" (UID: \"f0d698fd-3f42-4997-9c85-6bdb897795dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995595 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-config\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995677 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/813828d1-6b58-42d0-a3e6-b5b0c67423c7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995795 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/38c5642d-e433-49dc-9143-c7dd72739498-signing-key\") pod \"service-ca-9c57cc56f-mqnml\" (UID: \"38c5642d-e433-49dc-9143-c7dd72739498\") " pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995831 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhvmj\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-kube-api-access-jhvmj\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995906 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48vtt\" (UniqueName: \"kubernetes.io/projected/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-kube-api-access-48vtt\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 
06:44:45.995924 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e7c6c80-51df-45e3-86c7-4519f0a63582-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cfc5j\" (UID: \"5e7c6c80-51df-45e3-86c7-4519f0a63582\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.995946 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl4g6\" (UniqueName: \"kubernetes.io/projected/31fbacb4-73b4-43a9-a823-17f9b9662c7e-kube-api-access-cl4g6\") pod \"ingress-canary-mnvbs\" (UID: \"31fbacb4-73b4-43a9-a823-17f9b9662c7e\") " pod="openshift-ingress-canary/ingress-canary-mnvbs" Feb 01 06:44:45 crc kubenswrapper[4546]: E0201 06:44:45.995999 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:46.495981349 +0000 UTC m=+117.146917365 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996085 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-service-ca-bundle\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996198 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-bound-sa-token\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996235 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fssq9\" (UniqueName: \"kubernetes.io/projected/38c5642d-e433-49dc-9143-c7dd72739498-kube-api-access-fssq9\") pod \"service-ca-9c57cc56f-mqnml\" (UID: \"38c5642d-e433-49dc-9143-c7dd72739498\") " pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996270 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/813828d1-6b58-42d0-a3e6-b5b0c67423c7-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996306 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996348 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/38c5642d-e433-49dc-9143-c7dd72739498-signing-cabundle\") pod \"service-ca-9c57cc56f-mqnml\" (UID: \"38c5642d-e433-49dc-9143-c7dd72739498\") " pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996365 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31fbacb4-73b4-43a9-a823-17f9b9662c7e-cert\") pod \"ingress-canary-mnvbs\" (UID: \"31fbacb4-73b4-43a9-a823-17f9b9662c7e\") " pod="openshift-ingress-canary/ingress-canary-mnvbs" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996412 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-certificates\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996427 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/72d57c3b-9eb0-4f6d-b538-80a5072b170d-serving-cert\") pod \"service-ca-operator-777779d784-hqcww\" (UID: \"72d57c3b-9eb0-4f6d-b538-80a5072b170d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996464 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996482 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/68fa9ddb-0c76-4535-964b-5cfe6af0333e-node-bootstrap-token\") pod \"machine-config-server-7mzw5\" (UID: \"68fa9ddb-0c76-4535-964b-5cfe6af0333e\") " pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996543 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e7c6c80-51df-45e3-86c7-4519f0a63582-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cfc5j\" (UID: \"5e7c6c80-51df-45e3-86c7-4519f0a63582\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996572 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/68fa9ddb-0c76-4535-964b-5cfe6af0333e-certs\") pod \"machine-config-server-7mzw5\" (UID: \"68fa9ddb-0c76-4535-964b-5cfe6af0333e\") " pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:45 crc 
kubenswrapper[4546]: I0201 06:44:45.996644 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72d57c3b-9eb0-4f6d-b538-80a5072b170d-config\") pod \"service-ca-operator-777779d784-hqcww\" (UID: \"72d57c3b-9eb0-4f6d-b538-80a5072b170d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996775 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-serving-cert\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.996871 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c20a1084-47a6-47f8-87dc-da0528e83b7c-config-volume\") pod \"dns-default-9rzq5\" (UID: \"c20a1084-47a6-47f8-87dc-da0528e83b7c\") " pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.997013 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-tls\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.997086 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-trusted-ca\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.997108 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c20a1084-47a6-47f8-87dc-da0528e83b7c-metrics-tls\") pod \"dns-default-9rzq5\" (UID: \"c20a1084-47a6-47f8-87dc-da0528e83b7c\") " pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.997149 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e7c6c80-51df-45e3-86c7-4519f0a63582-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cfc5j\" (UID: \"5e7c6c80-51df-45e3-86c7-4519f0a63582\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.997178 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmkb\" (UniqueName: \"kubernetes.io/projected/68fa9ddb-0c76-4535-964b-5cfe6af0333e-kube-api-access-qkmkb\") pod \"machine-config-server-7mzw5\" (UID: \"68fa9ddb-0c76-4535-964b-5cfe6af0333e\") " pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.997242 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e473656b-7b9e-4c95-90e0-c67f074cafdc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hdscf\" (UID: \"e473656b-7b9e-4c95-90e0-c67f074cafdc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.997271 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq2k7\" (UniqueName: 
\"kubernetes.io/projected/c20a1084-47a6-47f8-87dc-da0528e83b7c-kube-api-access-xq2k7\") pod \"dns-default-9rzq5\" (UID: \"c20a1084-47a6-47f8-87dc-da0528e83b7c\") " pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.998338 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-service-ca-bundle\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:45 crc kubenswrapper[4546]: I0201 06:44:45.998726 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-config\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.000781 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/813828d1-6b58-42d0-a3e6-b5b0c67423c7-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.001738 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.002152 4546 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:46.502144645 +0000 UTC m=+117.153080662 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.002911 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/38c5642d-e433-49dc-9143-c7dd72739498-signing-cabundle\") pod \"service-ca-9c57cc56f-mqnml\" (UID: \"38c5642d-e433-49dc-9143-c7dd72739498\") " pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.004481 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-certificates\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.005976 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e7c6c80-51df-45e3-86c7-4519f0a63582-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cfc5j\" (UID: \"5e7c6c80-51df-45e3-86c7-4519f0a63582\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:46 crc kubenswrapper[4546]: 
I0201 06:44:46.007413 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72d57c3b-9eb0-4f6d-b538-80a5072b170d-config\") pod \"service-ca-operator-777779d784-hqcww\" (UID: \"72d57c3b-9eb0-4f6d-b538-80a5072b170d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.007523 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-trusted-ca\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.013829 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/68fa9ddb-0c76-4535-964b-5cfe6af0333e-node-bootstrap-token\") pod \"machine-config-server-7mzw5\" (UID: \"68fa9ddb-0c76-4535-964b-5cfe6af0333e\") " pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.013890 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-serving-cert\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.014092 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72d57c3b-9eb0-4f6d-b538-80a5072b170d-serving-cert\") pod \"service-ca-operator-777779d784-hqcww\" (UID: \"72d57c3b-9eb0-4f6d-b538-80a5072b170d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" 
Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.014246 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/68fa9ddb-0c76-4535-964b-5cfe6af0333e-certs\") pod \"machine-config-server-7mzw5\" (UID: \"68fa9ddb-0c76-4535-964b-5cfe6af0333e\") " pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.016380 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-tls\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.021310 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e7c6c80-51df-45e3-86c7-4519f0a63582-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cfc5j\" (UID: \"5e7c6c80-51df-45e3-86c7-4519f0a63582\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.022820 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/38c5642d-e433-49dc-9143-c7dd72739498-signing-key\") pod \"service-ca-9c57cc56f-mqnml\" (UID: \"38c5642d-e433-49dc-9143-c7dd72739498\") " pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.026357 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/813828d1-6b58-42d0-a3e6-b5b0c67423c7-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.027081 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e473656b-7b9e-4c95-90e0-c67f074cafdc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hdscf\" (UID: \"e473656b-7b9e-4c95-90e0-c67f074cafdc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.045952 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6hk5\" (UniqueName: \"kubernetes.io/projected/72d57c3b-9eb0-4f6d-b538-80a5072b170d-kube-api-access-r6hk5\") pod \"service-ca-operator-777779d784-hqcww\" (UID: \"72d57c3b-9eb0-4f6d-b538-80a5072b170d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.061364 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxc5\" (UniqueName: \"kubernetes.io/projected/e473656b-7b9e-4c95-90e0-c67f074cafdc-kube-api-access-kgxc5\") pod \"multus-admission-controller-857f4d67dd-hdscf\" (UID: \"e473656b-7b9e-4c95-90e0-c67f074cafdc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.088725 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58rd\" (UniqueName: \"kubernetes.io/projected/f0d698fd-3f42-4997-9c85-6bdb897795dd-kube-api-access-x58rd\") pod \"migrator-59844c95c7-6wz2l\" (UID: \"f0d698fd-3f42-4997-9c85-6bdb897795dd\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.098178 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.098298 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31fbacb4-73b4-43a9-a823-17f9b9662c7e-cert\") pod \"ingress-canary-mnvbs\" (UID: \"31fbacb4-73b4-43a9-a823-17f9b9662c7e\") " pod="openshift-ingress-canary/ingress-canary-mnvbs" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.098344 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c20a1084-47a6-47f8-87dc-da0528e83b7c-config-volume\") pod \"dns-default-9rzq5\" (UID: \"c20a1084-47a6-47f8-87dc-da0528e83b7c\") " pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.098360 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c20a1084-47a6-47f8-87dc-da0528e83b7c-metrics-tls\") pod \"dns-default-9rzq5\" (UID: \"c20a1084-47a6-47f8-87dc-da0528e83b7c\") " pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.098401 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq2k7\" (UniqueName: \"kubernetes.io/projected/c20a1084-47a6-47f8-87dc-da0528e83b7c-kube-api-access-xq2k7\") pod \"dns-default-9rzq5\" (UID: \"c20a1084-47a6-47f8-87dc-da0528e83b7c\") " pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.098456 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl4g6\" (UniqueName: \"kubernetes.io/projected/31fbacb4-73b4-43a9-a823-17f9b9662c7e-kube-api-access-cl4g6\") pod \"ingress-canary-mnvbs\" (UID: 
\"31fbacb4-73b4-43a9-a823-17f9b9662c7e\") " pod="openshift-ingress-canary/ingress-canary-mnvbs" Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.098611 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:46.598600024 +0000 UTC m=+117.249536040 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.099743 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c20a1084-47a6-47f8-87dc-da0528e83b7c-config-volume\") pod \"dns-default-9rzq5\" (UID: \"c20a1084-47a6-47f8-87dc-da0528e83b7c\") " pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.102903 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31fbacb4-73b4-43a9-a823-17f9b9662c7e-cert\") pod \"ingress-canary-mnvbs\" (UID: \"31fbacb4-73b4-43a9-a823-17f9b9662c7e\") " pod="openshift-ingress-canary/ingress-canary-mnvbs" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.103186 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c20a1084-47a6-47f8-87dc-da0528e83b7c-metrics-tls\") pod \"dns-default-9rzq5\" (UID: \"c20a1084-47a6-47f8-87dc-da0528e83b7c\") " 
pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.115376 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-bound-sa-token\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.125516 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fssq9\" (UniqueName: \"kubernetes.io/projected/38c5642d-e433-49dc-9143-c7dd72739498-kube-api-access-fssq9\") pod \"service-ca-9c57cc56f-mqnml\" (UID: \"38c5642d-e433-49dc-9143-c7dd72739498\") " pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.145303 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5e7c6c80-51df-45e3-86c7-4519f0a63582-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-cfc5j\" (UID: \"5e7c6c80-51df-45e3-86c7-4519f0a63582\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.176284 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pk4cv"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.182801 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmkb\" (UniqueName: \"kubernetes.io/projected/68fa9ddb-0c76-4535-964b-5cfe6af0333e-kube-api-access-qkmkb\") pod \"machine-config-server-7mzw5\" (UID: \"68fa9ddb-0c76-4535-964b-5cfe6af0333e\") " pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.187636 4546 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jhvmj\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-kube-api-access-jhvmj\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.194789 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mcws5" event={"ID":"3c232787-4f08-451b-ab33-d78c86f00dc7","Type":"ContainerStarted","Data":"1871b814632704d8d61a0b027ea5445e14c5dfba46b23601e2c246f4752d3994"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.194900 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mcws5" event={"ID":"3c232787-4f08-451b-ab33-d78c86f00dc7","Type":"ContainerStarted","Data":"46f3235749c4a3575788b4e609728fdab45fb3f16aef4cd512150ecb6d9ab966"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.195812 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" event={"ID":"18dbc0ae-24aa-4377-90b5-52cff1a5e855","Type":"ContainerStarted","Data":"9e050a5bf217b7fa933f126d4530be3f06139b4605001b8a27b982ac747212f0"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.199280 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.199621 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-02-01 06:44:46.699609958 +0000 UTC m=+117.350545974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.202225 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48vtt\" (UniqueName: \"kubernetes.io/projected/f8977c0f-4431-4756-8e65-6dbdfd1b9fbc-kube-api-access-48vtt\") pod \"authentication-operator-69f744f599-85r82\" (UID: \"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.206239 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.211463 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" event={"ID":"a6d2f6da-ac32-41d4-b1bb-ed5c96364254","Type":"ContainerStarted","Data":"af7ae06f4bb919b6b28353e9de5eacc33251c26b5995870d66adf7afd96fa441"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.213178 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.229033 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" event={"ID":"7cdce34f-3d94-4efb-b9eb-627ce9da7031","Type":"ContainerStarted","Data":"8a609aa523e30e6dda3f5acdc4d3632aae17f671412ed40ac05a17213534f704"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.234140 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l" Feb 01 06:44:46 crc kubenswrapper[4546]: W0201 06:44:46.235320 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8fcf426_a005_4459_a161_17905ef2f5ea.slice/crio-a10015089eec75ff237b3db49bc44ae1d9739d5630e2aa24c016344cfed3f723 WatchSource:0}: Error finding container a10015089eec75ff237b3db49bc44ae1d9739d5630e2aa24c016344cfed3f723: Status 404 returned error can't find the container with id a10015089eec75ff237b3db49bc44ae1d9739d5630e2aa24c016344cfed3f723 Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.235746 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" event={"ID":"7e6c27ea-97c9-4f56-ad23-91cda30acf6b","Type":"ContainerStarted","Data":"98818afe462b4c63ded81cb79c453576c18144fb7bb030e5ffc1a27a692927f5"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.236873 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9n5vv" event={"ID":"07764668-24b4-4b55-ba97-eaf6d205d497","Type":"ContainerStarted","Data":"c82c809024262e9ff228d368de51266560c4541da96b1c69dbca8a2a87520785"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.236893 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9n5vv" event={"ID":"07764668-24b4-4b55-ba97-eaf6d205d497","Type":"ContainerStarted","Data":"2436678747ff7b6239d66fe3c8dece9eaa8a575899a5ef4842912f144c6413f7"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.237470 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.238738 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-wfvhf" event={"ID":"86b65b33-e838-40a0-84fa-e7c2a659cc1d","Type":"ContainerStarted","Data":"d00715ea45c8ee1e823e31c5f6a12cc09ec0dd2321841d76dd0e2ff84ae10670"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.239313 4546 patch_prober.go:28] interesting pod/console-operator-58897d9998-9n5vv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.239339 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9n5vv" podUID="07764668-24b4-4b55-ba97-eaf6d205d497" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.244560 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" event={"ID":"b3051be4-3bf1-4a18-8636-ed39c3a4c479","Type":"ContainerStarted","Data":"e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.244579 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" event={"ID":"b3051be4-3bf1-4a18-8636-ed39c3a4c479","Type":"ContainerStarted","Data":"0c54578d2a054ee4e62b5d1d672985470c1a5bce58dd26f57b79aee91b03bdc7"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.244780 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.253836 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-cl4g6\" (UniqueName: \"kubernetes.io/projected/31fbacb4-73b4-43a9-a823-17f9b9662c7e-kube-api-access-cl4g6\") pod \"ingress-canary-mnvbs\" (UID: \"31fbacb4-73b4-43a9-a823-17f9b9662c7e\") " pod="openshift-ingress-canary/ingress-canary-mnvbs" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.256165 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.261185 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.262240 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b4wcw"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.262671 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq2k7\" (UniqueName: \"kubernetes.io/projected/c20a1084-47a6-47f8-87dc-da0528e83b7c-kube-api-access-xq2k7\") pod \"dns-default-9rzq5\" (UID: \"c20a1084-47a6-47f8-87dc-da0528e83b7c\") " pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.279191 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.288332 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.289418 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5jr6p"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.300987 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:46 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:46 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:46 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.301014 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.301021 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.301822 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:46.801808923 +0000 UTC m=+117.452744939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.301917 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.309818 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.326270 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" event={"ID":"6df707cc-8a5d-437b-b822-4a7f2360c18d","Type":"ContainerStarted","Data":"0ae8bb86cb8b876d9150124c5b5eb8852cdf27ebbfd71f0c4d936919add099d4"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.338640 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.375223 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7mzw5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.381362 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.381617 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.388846 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.389012 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mnvbs" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.390818 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" event={"ID":"9760ca7f-b330-4ab0-ae37-57c150826f20","Type":"ContainerStarted","Data":"b3bccaadc65add06df497eaa46a9832f5282386d60ebcf7298fbc1a0b4d607c2"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.391476 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.399665 4546 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-gzcwd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.402132 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" podUID="9760ca7f-b330-4ab0-ae37-57c150826f20" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.402770 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.403091 4546 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:46.903081022 +0000 UTC m=+117.554017038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.407055 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8659n" event={"ID":"81d1f1d9-4f02-4d8e-946c-9cc1592090ae","Type":"ContainerStarted","Data":"95420f6555dc29b56292435133e15c3332a4bb93d7a736c9e0311ddc7c29afdd"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.412130 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.422591 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" event={"ID":"75263970-db40-455a-8873-d1cea12d384b","Type":"ContainerStarted","Data":"4f4c4621d59e04b83c19639f9e77aa42532f1ed74ae5a4c318be88e02d52d413"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.422625 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" event={"ID":"75263970-db40-455a-8873-d1cea12d384b","Type":"ContainerStarted","Data":"dda537d2d9dd165ab7b4a0fa2e389519c840744825f76ab5fca4e5e9f7b6e27e"} Feb 01 06:44:46 crc kubenswrapper[4546]: W0201 06:44:46.430214 4546 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080cf935_686b_449c_8c11_6a3c19039b78.slice/crio-71e6d2edc4f89326db83a05e5a8ea6aaedec2914553b4e961c5ff940f3451aa4 WatchSource:0}: Error finding container 71e6d2edc4f89326db83a05e5a8ea6aaedec2914553b4e961c5ff940f3451aa4: Status 404 returned error can't find the container with id 71e6d2edc4f89326db83a05e5a8ea6aaedec2914553b4e961c5ff940f3451aa4 Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.433535 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" event={"ID":"3a65f9c1-682b-4818-a663-19b9c5281d78","Type":"ContainerStarted","Data":"ec65d2e79e7bfb1f57564018dc7f486de84d2a88f40e1104c60b49d56ba50258"} Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.483233 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.503196 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.504805 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.003421395 +0000 UTC m=+117.654357402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.507208 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.511278 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.011260778 +0000 UTC m=+117.662196795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.608490 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.609084 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.109063093 +0000 UTC m=+117.759999109 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.609747 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.610085 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.110077013 +0000 UTC m=+117.761013029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.685213 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" podStartSLOduration=97.685191312 podStartE2EDuration="1m37.685191312s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:46.67563154 +0000 UTC m=+117.326567556" watchObservedRunningTime="2026-02-01 06:44:46.685191312 +0000 UTC m=+117.336127318" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.711353 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.711664 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.211650478 +0000 UTC m=+117.862586494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.760425 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs98t"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.762990 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.775526 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.788099 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.802073 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.814239 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.814324 4546 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4251430-d927-4b5a-b0a2-a119c8109252-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.814966 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.314929738 +0000 UTC m=+117.965865743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.815629 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4251430-d927-4b5a-b0a2-a119c8109252-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mznt2\" (UID: \"b4251430-d927-4b5a-b0a2-a119c8109252\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.834608 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.905894 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-mcws5" podStartSLOduration=97.905880609 
podStartE2EDuration="1m37.905880609s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:46.866149811 +0000 UTC m=+117.517085827" watchObservedRunningTime="2026-02-01 06:44:46.905880609 +0000 UTC m=+117.556816625" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.917655 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:46 crc kubenswrapper[4546]: E0201 06:44:46.918249 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.418234364 +0000 UTC m=+118.069170380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.945657 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.958928 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.964992 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.966331 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t2phc"] Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.968485 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" podStartSLOduration=97.968469412 podStartE2EDuration="1m37.968469412s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:46.958981596 +0000 UTC m=+117.609917612" watchObservedRunningTime="2026-02-01 06:44:46.968469412 +0000 UTC m=+117.619405428" Feb 01 06:44:46 crc kubenswrapper[4546]: I0201 06:44:46.995115 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9n5vv" podStartSLOduration=98.995083972 podStartE2EDuration="1m38.995083972s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:46.989635441 +0000 UTC m=+117.640571457" watchObservedRunningTime="2026-02-01 06:44:46.995083972 +0000 UTC m=+117.646019988" Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.019883 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.020200 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.520188566 +0000 UTC m=+118.171124582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.089135 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hdscf"] Feb 01 06:44:47 crc kubenswrapper[4546]: W0201 06:44:47.089640 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7397ef95_4126_4f2e_9ba4_162440d6b87f.slice/crio-9bcfdfc814ac03900e4b7533e2d2c1f4b06cf7573652876fa3fca5d4a9cd3623 WatchSource:0}: Error finding container 9bcfdfc814ac03900e4b7533e2d2c1f4b06cf7573652876fa3fca5d4a9cd3623: Status 404 returned error can't find the container with id 9bcfdfc814ac03900e4b7533e2d2c1f4b06cf7573652876fa3fca5d4a9cd3623 Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.122030 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.122811 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.622793305 +0000 UTC m=+118.273729321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.138990 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" podStartSLOduration=97.138969767 podStartE2EDuration="1m37.138969767s" podCreationTimestamp="2026-02-01 06:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:47.138086234 +0000 UTC m=+117.789022250" watchObservedRunningTime="2026-02-01 06:44:47.138969767 +0000 UTC m=+117.789905783" Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.214348 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l"] Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.223713 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.224111 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.724098146 +0000 UTC m=+118.375034162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.311833 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:47 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:47 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:47 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.314015 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.327621 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.328416 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:47.828402384 +0000 UTC m=+118.479338400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.434990 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.435472 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-01 06:44:47.935454909 +0000 UTC m=+118.586390924 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.534070 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" event={"ID":"9760ca7f-b330-4ab0-ae37-57c150826f20","Type":"ContainerStarted","Data":"f46ca2c820fea1a0e0e140b147ce39ae6e363572f3f4fbb2313c85e07531e5da"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.536054 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.536432 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.036420269 +0000 UTC m=+118.687356286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.552144 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" event={"ID":"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1","Type":"ContainerStarted","Data":"fe883d0ecd126f83edce8a5cdc5dac6826a2c2e271d143a896abcc084412eb60"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.557524 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.567473 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" event={"ID":"080cf935-686b-449c-8c11-6a3c19039b78","Type":"ContainerStarted","Data":"71e6d2edc4f89326db83a05e5a8ea6aaedec2914553b4e961c5ff940f3451aa4"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.571497 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7mzw5" event={"ID":"68fa9ddb-0c76-4535-964b-5cfe6af0333e","Type":"ContainerStarted","Data":"c2fb0dcc8877187cdbacd4f7ae31bb58e0d797d6e64c60d63bb8be0af880fe9e"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.575547 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" event={"ID":"e8fcf426-a005-4459-a161-17905ef2f5ea","Type":"ContainerStarted","Data":"a10015089eec75ff237b3db49bc44ae1d9739d5630e2aa24c016344cfed3f723"} 
Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.583370 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" event={"ID":"e40441f6-397d-4546-b5ec-62c6e936be97","Type":"ContainerStarted","Data":"26d8d3924d66c751bf06a60c052323f546d9257bbab26b636cd13a34274cbe58"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.593057 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" event={"ID":"45f3b96f-5161-47ae-a33b-8a895303ae28","Type":"ContainerStarted","Data":"09c3b4d6fe7afcd9bbbe4425c0842a2becc2cbc43939303f204a0d318baa2ff9"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.604600 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t2phc" event={"ID":"7397ef95-4126-4f2e-9ba4-162440d6b87f","Type":"ContainerStarted","Data":"9bcfdfc814ac03900e4b7533e2d2c1f4b06cf7573652876fa3fca5d4a9cd3623"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.608574 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" event={"ID":"7956bd03-db5a-4524-85ac-184466ca0029","Type":"ContainerStarted","Data":"8f0506ad273a2d831fdf2151b05b023b8c57c197005691646bf10acb13162a9e"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.642823 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" event={"ID":"1e5bbf75-17f7-4156-876c-8974e116f225","Type":"ContainerStarted","Data":"fc93bfc07b2a12e5df7275d2e577c1e05543c2498da8eae2feccc2d52f64ba8f"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.645170 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.646105 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.146090032 +0000 UTC m=+118.797026038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.747999 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.748247 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.248224495 +0000 UTC m=+118.899160510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.748499 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.749266 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.249253983 +0000 UTC m=+118.900189999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.749421 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8659n" podStartSLOduration=99.749408063 podStartE2EDuration="1m39.749408063s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:47.747319931 +0000 UTC m=+118.398255937" watchObservedRunningTime="2026-02-01 06:44:47.749408063 +0000 UTC m=+118.400344070" Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.786740 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" event={"ID":"0904ae3e-72bf-4b72-9c6b-734d840b9cf5","Type":"ContainerStarted","Data":"785b532f6674809ecba271333a7ab739762411fb28b0f531f629c083eb2ae142"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.786773 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" event={"ID":"7cdce34f-3d94-4efb-b9eb-627ce9da7031","Type":"ContainerStarted","Data":"1d9526ecdd1cdfd5a51123ea86367bd7b9261928efd53501aafd0d806ea064a3"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.802114 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" 
event={"ID":"e473656b-7b9e-4c95-90e0-c67f074cafdc","Type":"ContainerStarted","Data":"db7620ba1117b0da1ea475e937b8e8bf5712da2cc1d7b150c1c6f224677a0de5"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.822084 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" event={"ID":"c853a7cc-059d-4757-951b-e094ae75d27f","Type":"ContainerStarted","Data":"42a04c88d538c27a89d7ba2ac97edcc2dae023f0a8c2a21df6e0c27a164fed8e"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.832569 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" event={"ID":"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7","Type":"ContainerStarted","Data":"49e1d65dda036ff50087725d34a3160ba948c15ff707f9a834b9e6356534e93b"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.833655 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l" event={"ID":"f0d698fd-3f42-4997-9c85-6bdb897795dd","Type":"ContainerStarted","Data":"ef2dfd29dc408b15982261535a5a5de204f748d0b9178b55663d4e0b5b006b62"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.853873 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:47 crc kubenswrapper[4546]: E0201 06:44:47.854177 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.354163953 +0000 UTC m=+119.005099968 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.883754 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" event={"ID":"7e6c27ea-97c9-4f56-ad23-91cda30acf6b","Type":"ContainerStarted","Data":"f8b8b810166adb5a47ca699a10a8a9d247ca3b76a9170de8e82ba79b652307a2"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.912367 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" event={"ID":"a6d2f6da-ac32-41d4-b1bb-ed5c96364254","Type":"ContainerStarted","Data":"914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.913332 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.917322 4546 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9n59f container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.917364 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" podUID="a6d2f6da-ac32-41d4-b1bb-ed5c96364254" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.929396 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tnrrk" podStartSLOduration=98.929380655 podStartE2EDuration="1m38.929380655s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:47.928663405 +0000 UTC m=+118.579599421" watchObservedRunningTime="2026-02-01 06:44:47.929380655 +0000 UTC m=+118.580316671" Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.949100 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d4zzt" event={"ID":"3a65f9c1-682b-4818-a663-19b9c5281d78","Type":"ContainerStarted","Data":"089d8ac2b69c34d7a35434771bb2fd919a8e0dac956eeb92e71b2b5cf11ff0f8"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.965135 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" event={"ID":"f24929dd-69fc-4c32-ae8a-65de2d609529","Type":"ContainerStarted","Data":"c2119cf8987cdfe2b5081f4271d124eb904fb5482a124d60b81e3cb107725b86"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.974361 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" event={"ID":"18dbc0ae-24aa-4377-90b5-52cff1a5e855","Type":"ContainerStarted","Data":"d64dd679a2e7597d17d8b366db490eed131d158363bca6fcbc16036c9e54ecfc"} Feb 01 06:44:47 crc kubenswrapper[4546]: I0201 06:44:47.992576 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" 
event={"ID":"7b27933c-64bc-4259-9eb3-62faa9ae7fbb","Type":"ContainerStarted","Data":"3f6a1ab3991b2114f40da94143d5d05d836790f57286c0ea8ffff8a500070c18"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.055228 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" event={"ID":"a9fcb2d7-d6c8-49b4-9574-cad807b4310b","Type":"ContainerStarted","Data":"67a9e42a5a965556851cb2665710a9542920ca941ccb83fdc5463daff010c7b9"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.073798 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:48 crc kubenswrapper[4546]: E0201 06:44:48.085806 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.585784816 +0000 UTC m=+119.236720821 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.100604 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" podStartSLOduration=99.100585648 podStartE2EDuration="1m39.100585648s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:48.09977977 +0000 UTC m=+118.750715786" watchObservedRunningTime="2026-02-01 06:44:48.100585648 +0000 UTC m=+118.751521664" Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.130484 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9rzq5"] Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.146724 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qxwqf" podStartSLOduration=99.146708242 podStartE2EDuration="1m39.146708242s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:48.146325021 +0000 UTC m=+118.797261037" watchObservedRunningTime="2026-02-01 06:44:48.146708242 +0000 UTC m=+118.797644258" Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.169385 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" 
event={"ID":"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b","Type":"ContainerStarted","Data":"f87ec3230cd8b16e0a18dec351eac889c2a7c839ed58ae81dbd03977c6be6374"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.169431 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" event={"ID":"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b","Type":"ContainerStarted","Data":"a7e9b77c515a987d9f7b55825e9d4458376c8dc4a6664a40337791e655be0eff"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.175543 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:48 crc kubenswrapper[4546]: E0201 06:44:48.175990 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.675969316 +0000 UTC m=+119.326905331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.197877 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-85r82"] Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.206444 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" event={"ID":"7c73d592-2bf8-4b99-abdc-2fdeea5f2245","Type":"ContainerStarted","Data":"730a14923ef94d2055e08a0b71320578ff8b58f4d209352889d62536279532ff"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.245692 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hqcww"] Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.287572 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:48 crc kubenswrapper[4546]: E0201 06:44:48.287927 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.787911657 +0000 UTC m=+119.438847673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.293403 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:48 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:48 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:48 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.293438 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.295624 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" event={"ID":"75263970-db40-455a-8873-d1cea12d384b","Type":"ContainerStarted","Data":"86884c0b632fc6e784d8d0be10bf97f5e701982ade4a0bb03bc1adbd854175d6"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.305809 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" event={"ID":"02ac825d-2a57-4917-9cd0-d8d058a8fb95","Type":"ContainerStarted","Data":"9b3b9b110ab87b11d17e59b1c692d6f51564c40d58b720729eb5d8fc66dc5ba4"} Feb 01 06:44:48 
crc kubenswrapper[4546]: I0201 06:44:48.335890 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kv9mn" podStartSLOduration=100.335875369 podStartE2EDuration="1m40.335875369s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:48.3347606 +0000 UTC m=+118.985696616" watchObservedRunningTime="2026-02-01 06:44:48.335875369 +0000 UTC m=+118.986811386" Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.336948 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wfvhf" event={"ID":"86b65b33-e838-40a0-84fa-e7c2a659cc1d","Type":"ContainerStarted","Data":"2b2c2e04fe50060ed1511a6e935f9a8c200a67cc3b79842d0e6f0d1edcbbbcaf"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.338100 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-wfvhf" Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.349130 4546 patch_prober.go:28] interesting pod/downloads-7954f5f757-wfvhf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.349177 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wfvhf" podUID="86b65b33-e838-40a0-84fa-e7c2a659cc1d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.353153 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" event={"ID":"6df707cc-8a5d-437b-b822-4a7f2360c18d","Type":"ContainerStarted","Data":"71eb1c07f2cb577873730dfa1fae4bc00cb05bcd64f62e8fc16fabf8d77d6902"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.378498 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8659n" event={"ID":"81d1f1d9-4f02-4d8e-946c-9cc1592090ae","Type":"ContainerStarted","Data":"d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.391187 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:48 crc kubenswrapper[4546]: E0201 06:44:48.392166 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.89214963 +0000 UTC m=+119.543085637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.397966 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" event={"ID":"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697","Type":"ContainerStarted","Data":"a2d75f6837aa7f596e840c8d8498bfe80d80474548b073422ec7495b18a9c25f"} Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.424380 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9n5vv" Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.435544 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wfvhf" podStartSLOduration=99.435533198 podStartE2EDuration="1m39.435533198s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:48.396948197 +0000 UTC m=+119.047884213" watchObservedRunningTime="2026-02-01 06:44:48.435533198 +0000 UTC m=+119.086469215" Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.435866 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mnvbs"] Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.454563 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vmpqc" 
podStartSLOduration=99.454542545 podStartE2EDuration="1m39.454542545s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:48.452254767 +0000 UTC m=+119.103190783" watchObservedRunningTime="2026-02-01 06:44:48.454542545 +0000 UTC m=+119.105478561" Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.493162 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:48 crc kubenswrapper[4546]: E0201 06:44:48.497835 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:48.997819511 +0000 UTC m=+119.648755527 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.534674 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mqnml"] Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.543036 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j"] Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.594136 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:48 crc kubenswrapper[4546]: E0201 06:44:48.594426 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:49.094414412 +0000 UTC m=+119.745350428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:48 crc kubenswrapper[4546]: W0201 06:44:48.665393 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38c5642d_e433_49dc_9143_c7dd72739498.slice/crio-1a69ac606a4f0f9eb606bb01b72075866bc0edcf3dba7fb551a76eabe4630ec3 WatchSource:0}: Error finding container 1a69ac606a4f0f9eb606bb01b72075866bc0edcf3dba7fb551a76eabe4630ec3: Status 404 returned error can't find the container with id 1a69ac606a4f0f9eb606bb01b72075866bc0edcf3dba7fb551a76eabe4630ec3 Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.694971 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:48 crc kubenswrapper[4546]: E0201 06:44:48.695281 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:49.19527186 +0000 UTC m=+119.846207876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.799729 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:48 crc kubenswrapper[4546]: E0201 06:44:48.800280 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:49.300262531 +0000 UTC m=+119.951198548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.896008 4546 csr.go:261] certificate signing request csr-lqddj is approved, waiting to be issued Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.905877 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:48 crc kubenswrapper[4546]: E0201 06:44:48.906276 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:49.406261341 +0000 UTC m=+120.057197357 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:48 crc kubenswrapper[4546]: I0201 06:44:48.911996 4546 csr.go:257] certificate signing request csr-lqddj is issued Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.010518 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.018079 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:49.51805329 +0000 UTC m=+120.168989306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.023928 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.024465 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:49.524454285 +0000 UTC m=+120.175390300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.087237 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2"] Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.129695 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.130396 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:49.630381169 +0000 UTC m=+120.281317185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.233083 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.233467 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:49.733455471 +0000 UTC m=+120.384391487 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.287000 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:49 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:49 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:49 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.287038 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.335687 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.336370 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-01 06:44:49.836355957 +0000 UTC m=+120.487291972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.439870 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.440174 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:49.940164843 +0000 UTC m=+120.591100858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.449479 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" event={"ID":"b4251430-d927-4b5a-b0a2-a119c8109252","Type":"ContainerStarted","Data":"62e80433bdad36c86fc1af6d2e573f911101a9510be87d994c462e388b90dbd4"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.463201 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" event={"ID":"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1","Type":"ContainerStarted","Data":"cc11776baa08ffd7dd388a6df3e6cba937c70d2ab37fa0d8d47e9ef085356004"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.509774 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9rzq5" event={"ID":"c20a1084-47a6-47f8-87dc-da0528e83b7c","Type":"ContainerStarted","Data":"97a1b2a97b2133bfa8ab5f3aee40ecb49139245a4bffacf0d2859f771604d116"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.510065 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9rzq5" event={"ID":"c20a1084-47a6-47f8-87dc-da0528e83b7c","Type":"ContainerStarted","Data":"cacf82d87a2d120ea41bc068e09a2425e025c816c89a338631756edcc6c3d43b"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.524763 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" 
podStartSLOduration=100.524752442 podStartE2EDuration="1m40.524752442s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:49.523254571 +0000 UTC m=+120.174190577" watchObservedRunningTime="2026-02-01 06:44:49.524752442 +0000 UTC m=+120.175688458" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.530944 4546 generic.go:334] "Generic (PLEG): container finished" podID="080cf935-686b-449c-8c11-6a3c19039b78" containerID="222ab0250ed269845594cbe54c9c2069303e76edbe5a69368469fb4ee0cbff63" exitCode=0 Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.530989 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" event={"ID":"080cf935-686b-449c-8c11-6a3c19039b78","Type":"ContainerDied","Data":"222ab0250ed269845594cbe54c9c2069303e76edbe5a69368469fb4ee0cbff63"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.540275 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" event={"ID":"72d57c3b-9eb0-4f6d-b538-80a5072b170d","Type":"ContainerStarted","Data":"8e097e0de8c547a9b4ea14b85e5de44427bcf637dbfd53d9939614d954e44a2c"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.540304 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" event={"ID":"72d57c3b-9eb0-4f6d-b538-80a5072b170d","Type":"ContainerStarted","Data":"40bc89a95d14e4a919ff12e96c5809e375861ac58b5e4a9a3308da5ff198e7cd"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.544974 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.545837 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.045826136 +0000 UTC m=+120.696762152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.558637 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" event={"ID":"f24929dd-69fc-4c32-ae8a-65de2d609529","Type":"ContainerStarted","Data":"66842af8e5c9ae2d7f7b8f9535a0437c25b82bcd97cc81fd5b79af15b41b2a9b"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.584062 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" event={"ID":"1e5bbf75-17f7-4156-876c-8974e116f225","Type":"ContainerStarted","Data":"66ab6a508669f4106e9d14378b9120197ed84c2d82ffb615e8ec518b2db4034d"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.588694 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" event={"ID":"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7","Type":"ContainerStarted","Data":"eef7d6b20fd8acb44a813f1514e33b0d6b8f321796ee50e4105aba6862e276b4"} Feb 01 06:44:49 crc 
kubenswrapper[4546]: I0201 06:44:49.600492 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mnvbs" event={"ID":"31fbacb4-73b4-43a9-a823-17f9b9662c7e","Type":"ContainerStarted","Data":"5d7d2962f1165e5d10a993cb129a223eac6430e6214ed177b6b159e2ad076896"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.600532 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mnvbs" event={"ID":"31fbacb4-73b4-43a9-a823-17f9b9662c7e","Type":"ContainerStarted","Data":"6a4a1ed924ca9e32268873757cd8fc2db3acbcd89afce32908a9d883bd6274c6"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.602204 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" event={"ID":"45f3b96f-5161-47ae-a33b-8a895303ae28","Type":"ContainerStarted","Data":"1ac9360c7c68506ca9bb5e0015bad9634e2276c8be2130b9298416c2d9046b9b"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.602812 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.629818 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" event={"ID":"0904ae3e-72bf-4b72-9c6b-734d840b9cf5","Type":"ContainerStarted","Data":"30fd37b455f8af9eb03ce8fc4accd607a66296af0632af56c6f814f570eacd0f"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.647398 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:49 crc 
kubenswrapper[4546]: I0201 06:44:49.647916 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hqcww" podStartSLOduration=99.647905496 podStartE2EDuration="1m39.647905496s" podCreationTimestamp="2026-02-01 06:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:49.646335409 +0000 UTC m=+120.297271424" watchObservedRunningTime="2026-02-01 06:44:49.647905496 +0000 UTC m=+120.298841512" Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.649580 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.149568748 +0000 UTC m=+120.800504764 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.652873 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.750363 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.765248 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.265211518 +0000 UTC m=+120.916147535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.775431 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" event={"ID":"c853a7cc-059d-4757-951b-e094ae75d27f","Type":"ContainerStarted","Data":"752f3db7b795becef02cf2828ce360c461872c120e613be951178c4a58450b72"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.790716 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2rwz" podStartSLOduration=100.790690578 podStartE2EDuration="1m40.790690578s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:49.777242384 +0000 UTC m=+120.428178400" watchObservedRunningTime="2026-02-01 06:44:49.790690578 +0000 UTC m=+120.441626595" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.794523 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.812806 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.831088 4546 generic.go:334] "Generic (PLEG): container finished" podID="21610d9b-73c6-4b4c-bc13-032e6f2b0f3b" containerID="f87ec3230cd8b16e0a18dec351eac889c2a7c839ed58ae81dbd03977c6be6374" exitCode=0 Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.831219 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" event={"ID":"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b","Type":"ContainerDied","Data":"f87ec3230cd8b16e0a18dec351eac889c2a7c839ed58ae81dbd03977c6be6374"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.831313 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" event={"ID":"21610d9b-73c6-4b4c-bc13-032e6f2b0f3b","Type":"ContainerStarted","Data":"8b7d057c5ef4e7e505ac4c9243accab70a1ec3b1652ffdc1ddeda66171396691"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.831715 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.853908 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" event={"ID":"7956bd03-db5a-4524-85ac-184466ca0029","Type":"ContainerStarted","Data":"ca304dd09fcf01e1718a436c3e7c4aae29c702de169689021308124930858843"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.853957 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" 
event={"ID":"7956bd03-db5a-4524-85ac-184466ca0029","Type":"ContainerStarted","Data":"f853e24150623b7c63ef203894361f994d0d8f1c24b8ea76ada5c423943c61bb"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.867581 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.869314 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.369300111 +0000 UTC m=+121.020236127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.875105 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l" event={"ID":"f0d698fd-3f42-4997-9c85-6bdb897795dd","Type":"ContainerStarted","Data":"c51fcb43efb4896ef32e9ed7b47a1bb72f0f79e9f3c6125859bf0dcd7b2f18ec"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.878159 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" 
event={"ID":"e8fcf426-a005-4459-a161-17905ef2f5ea","Type":"ContainerStarted","Data":"435e6b81af6016c7b8ea7469992a3c495456746f85aaac212bc73b3d56a8c9aa"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.878260 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" event={"ID":"e8fcf426-a005-4459-a161-17905ef2f5ea","Type":"ContainerStarted","Data":"f78d18e945ea3f7141914d816ca718849508a5b7f67854dcfa3706cf9bc9c2f0"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.910599 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" event={"ID":"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc","Type":"ContainerStarted","Data":"858f3a6d9aa09a5036259ca1cdc5f4ed9ab1d0d3eac39d08308f79021b959ce4"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.910914 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" event={"ID":"f8977c0f-4431-4756-8e65-6dbdfd1b9fbc","Type":"ContainerStarted","Data":"7ead5cfc4b0670bbd2185191fafbafab67beb786c16d6520e305c61c404e6ba4"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.912980 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-01 06:39:48 +0000 UTC, rotation deadline is 2026-12-09 18:25:37.914653334 +0000 UTC Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.913028 4546 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7475h40m48.001627011s for next certificate rotation Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.944299 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" event={"ID":"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697","Type":"ContainerStarted","Data":"777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 
06:44:49.944360 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.955563 4546 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bs98t container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.955612 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" podUID="ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.968219 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.968866 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" event={"ID":"5e7c6c80-51df-45e3-86c7-4519f0a63582","Type":"ContainerStarted","Data":"651df4fa8401ba010fddb122a88591dd559c4955b774c72fac919f8039e794ad"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.970817 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" 
event={"ID":"5e7c6c80-51df-45e3-86c7-4519f0a63582","Type":"ContainerStarted","Data":"978ecc99293bb4d4ec031ee8e8d6e50ce536f91bfac9cc2fc70284713407b70f"} Feb 01 06:44:49 crc kubenswrapper[4546]: E0201 06:44:49.970932 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.470919204 +0000 UTC m=+121.121855219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.982467 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n2974" podStartSLOduration=100.982439468 podStartE2EDuration="1m40.982439468s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:49.980703528 +0000 UTC m=+120.631639544" watchObservedRunningTime="2026-02-01 06:44:49.982439468 +0000 UTC m=+120.633375484" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.982906 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" podStartSLOduration=100.982899463 podStartE2EDuration="1m40.982899463s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:49.907172028 +0000 UTC m=+120.558108045" watchObservedRunningTime="2026-02-01 06:44:49.982899463 +0000 UTC m=+120.633835470" Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.999303 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" event={"ID":"38c5642d-e433-49dc-9143-c7dd72739498","Type":"ContainerStarted","Data":"794bf05a3d06cf9927125ad024f3152a16801a3846100c866ca950b14dbe1a6b"} Feb 01 06:44:49 crc kubenswrapper[4546]: I0201 06:44:49.999346 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" event={"ID":"38c5642d-e433-49dc-9143-c7dd72739498","Type":"ContainerStarted","Data":"1a69ac606a4f0f9eb606bb01b72075866bc0edcf3dba7fb551a76eabe4630ec3"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.022524 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" event={"ID":"e40441f6-397d-4546-b5ec-62c6e936be97","Type":"ContainerStarted","Data":"183cba7a7dfdfaef2dd5540a6cd062e5b4ac864e25da96d173c9bf2f32322da5"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.022567 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" event={"ID":"e40441f6-397d-4546-b5ec-62c6e936be97","Type":"ContainerStarted","Data":"29a9189e567cb550ce8d145c0be1bad9bd844bb69124c7b86e1452df418b5bc1"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.033373 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7mzw5" event={"ID":"68fa9ddb-0c76-4535-964b-5cfe6af0333e","Type":"ContainerStarted","Data":"ae185ca91c5b7c61fbd8e76253ad646d0f8166f79fdaef81d49bb0eefa14bfc8"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.047316 4546 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" event={"ID":"e473656b-7b9e-4c95-90e0-c67f074cafdc","Type":"ContainerStarted","Data":"149cf89ffc7c2eacc211a01ce53782c60b5ffc75aba3386f1b6d3e111668325a"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.067246 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" event={"ID":"7e6c27ea-97c9-4f56-ad23-91cda30acf6b","Type":"ContainerStarted","Data":"1d9acdee91552af2af832650a86810ed3ea20f2c657fe48438395f9a8489d9bb"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.079447 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" event={"ID":"02ac825d-2a57-4917-9cd0-d8d058a8fb95","Type":"ContainerStarted","Data":"a609992ec93eb4ded1a7e65fb0f150e0b6cacf94d72e9487adbde30337bd9b9f"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.082587 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.084203 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.584185318 +0000 UTC m=+121.235121335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.085847 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-f4tq9" podStartSLOduration=101.085833973 podStartE2EDuration="1m41.085833973s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:50.084472479 +0000 UTC m=+120.735408495" watchObservedRunningTime="2026-02-01 06:44:50.085833973 +0000 UTC m=+120.736769989" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.091913 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" event={"ID":"a9fcb2d7-d6c8-49b4-9574-cad807b4310b","Type":"ContainerStarted","Data":"2c6b19ba1da9de7492c539bd291efd084c88455611d948c60546487928b4c166"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.091947 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" event={"ID":"a9fcb2d7-d6c8-49b4-9574-cad807b4310b","Type":"ContainerStarted","Data":"fb830641c46265eae86a9ce31c33a48214be7a4c45cb2c0045da8049cc518615"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.092390 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:44:50 
crc kubenswrapper[4546]: I0201 06:44:50.105074 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" event={"ID":"7c73d592-2bf8-4b99-abdc-2fdeea5f2245","Type":"ContainerStarted","Data":"8087e729b566b6a5187e29e39aff904035d89ef05b6f741de7b24d44a17f1cd5"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.106925 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.116402 4546 generic.go:334] "Generic (PLEG): container finished" podID="7b27933c-64bc-4259-9eb3-62faa9ae7fbb" containerID="f9d31039fe92e9a7eb8f5c53e1fbcb7e97b27b6d4c7f22a836efa5cf118c0e00" exitCode=0 Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.118049 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" event={"ID":"7b27933c-64bc-4259-9eb3-62faa9ae7fbb","Type":"ContainerDied","Data":"f9d31039fe92e9a7eb8f5c53e1fbcb7e97b27b6d4c7f22a836efa5cf118c0e00"} Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.118150 4546 patch_prober.go:28] interesting pod/downloads-7954f5f757-wfvhf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.118204 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wfvhf" podUID="86b65b33-e838-40a0-84fa-e7c2a659cc1d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.118272 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-mnvbs" podStartSLOduration=7.118253894 podStartE2EDuration="7.118253894s" podCreationTimestamp="2026-02-01 06:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:50.116107812 +0000 UTC m=+120.767043828" watchObservedRunningTime="2026-02-01 06:44:50.118253894 +0000 UTC m=+120.769189900" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.131082 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.183466 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.185032 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.685015936 +0000 UTC m=+121.335951952 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.222722 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" podStartSLOduration=101.222698476 podStartE2EDuration="1m41.222698476s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:50.209080221 +0000 UTC m=+120.860016238" watchObservedRunningTime="2026-02-01 06:44:50.222698476 +0000 UTC m=+120.873634492" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.257731 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljf2d" podStartSLOduration=101.25769498 podStartE2EDuration="1m41.25769498s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:50.246967148 +0000 UTC m=+120.897903164" watchObservedRunningTime="2026-02-01 06:44:50.25769498 +0000 UTC m=+120.908630996" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.285364 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.285690 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.785678796 +0000 UTC m=+121.436614813 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.303071 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:50 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:50 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:50 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.303142 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.389356 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.389535 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.889514854 +0000 UTC m=+121.540450870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.390007 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.390393 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.890381646 +0000 UTC m=+121.541317662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.494419 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.494622 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.994591668 +0000 UTC m=+121.645527684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.494767 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.495290 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:50.995278661 +0000 UTC m=+121.646214677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.595644 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.596211 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.096197004 +0000 UTC m=+121.747133020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.697751 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.698182 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.198162999 +0000 UTC m=+121.849099015 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.798651 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.798804 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.298782249 +0000 UTC m=+121.949718265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.799062 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.799358 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.299350529 +0000 UTC m=+121.950286545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.825473 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pk4cv" podStartSLOduration=101.825457702 podStartE2EDuration="1m41.825457702s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:50.823373136 +0000 UTC m=+121.474309152" watchObservedRunningTime="2026-02-01 06:44:50.825457702 +0000 UTC m=+121.476393718" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.899732 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.899896 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.39986942 +0000 UTC m=+122.050805435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.899959 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wnvc9" podStartSLOduration=102.899942477 podStartE2EDuration="1m42.899942477s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:50.896291281 +0000 UTC m=+121.547227296" watchObservedRunningTime="2026-02-01 06:44:50.899942477 +0000 UTC m=+121.550878492" Feb 01 06:44:50 crc kubenswrapper[4546]: I0201 06:44:50.900121 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:50 crc kubenswrapper[4546]: E0201 06:44:50.900715 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.400702799 +0000 UTC m=+122.051638815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.001312 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.001480 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.501449457 +0000 UTC m=+122.152385473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.001609 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.001880 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.501872053 +0000 UTC m=+122.152808069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.072228 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dcj8j" podStartSLOduration=102.072213626 podStartE2EDuration="1m42.072213626s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:50.997891147 +0000 UTC m=+121.648827163" watchObservedRunningTime="2026-02-01 06:44:51.072213626 +0000 UTC m=+121.723149643" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.102890 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.103022 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.603008707 +0000 UTC m=+122.253944723 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.103217 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.103521 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.603513037 +0000 UTC m=+122.254449054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.106377 4546 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vjdk4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.106451 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" podUID="7c73d592-2bf8-4b99-abdc-2fdeea5f2245" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.123891 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-th6nh" event={"ID":"bae272c1-b2d0-4b06-8d7a-aa580f4c40e1","Type":"ContainerStarted","Data":"164cd9500f8e1f16b71e0d5453978bfb1f978907ee0cb6aee8230ecfb92c86ec"} Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.125211 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t2phc" event={"ID":"7397ef95-4126-4f2e-9ba4-162440d6b87f","Type":"ContainerStarted","Data":"da8e977356320823db8e03a3d0ff985b26a3776bffcb4fbce280b118d804910d"} Feb 01 
06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.126793 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5jr6p" event={"ID":"cf99eb99-fd9e-4fd3-a184-f36e64c6b6c7","Type":"ContainerStarted","Data":"4aa2187bb44b1b623bddc2b504f4aae9c9276eb17bd80941ea1b064b7c190854"} Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.128918 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l" event={"ID":"f0d698fd-3f42-4997-9c85-6bdb897795dd","Type":"ContainerStarted","Data":"b97f21ecae47ea7662268525887a4c99e01df135a0d4b996d1413d59539f77ee"} Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.130702 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" event={"ID":"e473656b-7b9e-4c95-90e0-c67f074cafdc","Type":"ContainerStarted","Data":"b05ff3568e03c3520b948ec9b0bf28aad88f229cc5f81d9f3abc049199ad0c46"} Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.132770 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" event={"ID":"080cf935-686b-449c-8c11-6a3c19039b78","Type":"ContainerStarted","Data":"41e3918b1f548431b7e790963bacd983b32b198af8b5434e1a398c4c95d4a824"} Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.132794 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" event={"ID":"080cf935-686b-449c-8c11-6a3c19039b78","Type":"ContainerStarted","Data":"8d425e4c2524b1a621c44eed43fae10ac7d408bb8714202e89222be8122bd465"} Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.135236 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" event={"ID":"7b27933c-64bc-4259-9eb3-62faa9ae7fbb","Type":"ContainerStarted","Data":"31a4cb06b608c9989087ac98bec572e8c40bff37fb16913f92cd31f12ad0dd86"} Feb 01 06:44:51 
crc kubenswrapper[4546]: I0201 06:44:51.136554 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" event={"ID":"b4251430-d927-4b5a-b0a2-a119c8109252","Type":"ContainerStarted","Data":"347b3b1cdef98c1df4ad63a196221364e8d4bcb1c987c14350af3ce2fddc318e"} Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.140562 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9rzq5" event={"ID":"c20a1084-47a6-47f8-87dc-da0528e83b7c","Type":"ContainerStarted","Data":"d546d08f91149edb4e0658bc4ec734b1b8ed890791b5660bb063dc337c8227a8"} Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.140595 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9rzq5" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.143312 4546 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bs98t container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.143346 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" podUID="ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.197384 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" podStartSLOduration=103.197372897 podStartE2EDuration="1m43.197372897s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-01 06:44:51.156823097 +0000 UTC m=+121.807759113" watchObservedRunningTime="2026-02-01 06:44:51.197372897 +0000 UTC m=+121.848308913" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.203783 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.203990 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.703966645 +0000 UTC m=+122.354902661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.213985 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.214812 4546 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.714793813 +0000 UTC m=+122.365729829 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.238581 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" podStartSLOduration=102.238563746 podStartE2EDuration="1m42.238563746s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.19786792 +0000 UTC m=+121.848803936" watchObservedRunningTime="2026-02-01 06:44:51.238563746 +0000 UTC m=+121.889499761" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.246482 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" podStartSLOduration=102.246467288 podStartE2EDuration="1m42.246467288s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.235696335 +0000 UTC m=+121.886632351" watchObservedRunningTime="2026-02-01 06:44:51.246467288 +0000 UTC m=+121.897403305" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.285213 4546 
patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:51 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:51 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:51 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.285274 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.295479 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" podStartSLOduration=102.295462042 podStartE2EDuration="1m42.295462042s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.293537288 +0000 UTC m=+121.944473304" watchObservedRunningTime="2026-02-01 06:44:51.295462042 +0000 UTC m=+121.946398058" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.323609 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.324058 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.824041512 +0000 UTC m=+122.474977527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.324105 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.324401 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.824386991 +0000 UTC m=+122.475323007 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.334442 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kn94x"] Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.339622 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.342427 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.351050 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jnks5" podStartSLOduration=102.351033591 podStartE2EDuration="1m42.351033591s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.340225688 +0000 UTC m=+121.991161704" watchObservedRunningTime="2026-02-01 06:44:51.351033591 +0000 UTC m=+122.001969607" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.354243 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kn94x"] Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.371354 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7mzw5" 
podStartSLOduration=9.37133951 podStartE2EDuration="9.37133951s" podCreationTimestamp="2026-02-01 06:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.369974759 +0000 UTC m=+122.020910775" watchObservedRunningTime="2026-02-01 06:44:51.37133951 +0000 UTC m=+122.022275526" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.425450 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.425690 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.925668369 +0000 UTC m=+122.576604384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.426071 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.426114 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-utilities\") pod \"community-operators-kn94x\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.426160 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-catalog-content\") pod \"community-operators-kn94x\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.426195 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxbd5\" (UniqueName: 
\"kubernetes.io/projected/bca0710d-d2ea-4726-84bb-0bf49d93a63a-kube-api-access-lxbd5\") pod \"community-operators-kn94x\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.426527 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:51.92649246 +0000 UTC m=+122.577428476 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.499880 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-85r82" podStartSLOduration=102.499843429 podStartE2EDuration="1m42.499843429s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.474121051 +0000 UTC m=+122.125057067" watchObservedRunningTime="2026-02-01 06:44:51.499843429 +0000 UTC m=+122.150779436" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.502200 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fkgtj"] Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.503065 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.510182 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.528991 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.529278 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-utilities\") pod \"community-operators-kn94x\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.529325 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-catalog-content\") pod \"community-operators-kn94x\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.529364 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxbd5\" (UniqueName: \"kubernetes.io/projected/bca0710d-d2ea-4726-84bb-0bf49d93a63a-kube-api-access-lxbd5\") pod \"community-operators-kn94x\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.529721 4546 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:52.029706235 +0000 UTC m=+122.680642251 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.530092 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-utilities\") pod \"community-operators-kn94x\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.530304 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-catalog-content\") pod \"community-operators-kn94x\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.569921 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-cfc5j" podStartSLOduration=102.569906737 podStartE2EDuration="1m42.569906737s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 
06:44:51.551117555 +0000 UTC m=+122.202053562" watchObservedRunningTime="2026-02-01 06:44:51.569906737 +0000 UTC m=+122.220842744" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.589947 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxbd5\" (UniqueName: \"kubernetes.io/projected/bca0710d-d2ea-4726-84bb-0bf49d93a63a-kube-api-access-lxbd5\") pod \"community-operators-kn94x\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.594896 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fkgtj"] Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.596008 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ztkkq" podStartSLOduration=102.595998042 podStartE2EDuration="1m42.595998042s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.586105413 +0000 UTC m=+122.237041428" watchObservedRunningTime="2026-02-01 06:44:51.595998042 +0000 UTC m=+122.246934058" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.630481 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-utilities\") pod \"certified-operators-fkgtj\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.630562 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.630607 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-catalog-content\") pod \"certified-operators-fkgtj\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.630643 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89lh8\" (UniqueName: \"kubernetes.io/projected/47612608-8394-4713-b59a-172469b14bbc-kube-api-access-89lh8\") pod \"certified-operators-fkgtj\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.630914 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:52.130903293 +0000 UTC m=+122.781839308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.657001 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.657489 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-mqnml" podStartSLOduration=101.657463349 podStartE2EDuration="1m41.657463349s" podCreationTimestamp="2026-02-01 06:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.655052238 +0000 UTC m=+122.305988255" watchObservedRunningTime="2026-02-01 06:44:51.657463349 +0000 UTC m=+122.308399365" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.731468 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.731649 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-catalog-content\") pod \"certified-operators-fkgtj\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " 
pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.731693 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89lh8\" (UniqueName: \"kubernetes.io/projected/47612608-8394-4713-b59a-172469b14bbc-kube-api-access-89lh8\") pod \"certified-operators-fkgtj\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.731767 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-utilities\") pod \"certified-operators-fkgtj\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.732469 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-utilities\") pod \"certified-operators-fkgtj\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.732551 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:52.232537232 +0000 UTC m=+122.883473248 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.732746 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-catalog-content\") pod \"certified-operators-fkgtj\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.740464 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gbmsq"] Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.741288 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.790531 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89lh8\" (UniqueName: \"kubernetes.io/projected/47612608-8394-4713-b59a-172469b14bbc-kube-api-access-89lh8\") pod \"certified-operators-fkgtj\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.792739 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gbmsq"] Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.802771 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mznt2" podStartSLOduration=102.802758107 podStartE2EDuration="1m42.802758107s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.800225488 +0000 UTC m=+122.451161495" watchObservedRunningTime="2026-02-01 06:44:51.802758107 +0000 UTC m=+122.453694123" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.833308 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-catalog-content\") pod \"community-operators-gbmsq\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.833342 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n47v4\" (UniqueName: \"kubernetes.io/projected/44bb6d15-c261-475e-9978-1d1495b630eb-kube-api-access-n47v4\") pod \"community-operators-gbmsq\" 
(UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.833403 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-utilities\") pod \"community-operators-gbmsq\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.833453 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.833686 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:52.33367576 +0000 UTC m=+122.984611767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.836215 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.895978 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9rzq5" podStartSLOduration=8.895963516 podStartE2EDuration="8.895963516s" podCreationTimestamp="2026-02-01 06:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.88459074 +0000 UTC m=+122.535526755" watchObservedRunningTime="2026-02-01 06:44:51.895963516 +0000 UTC m=+122.546899532" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.896129 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hdscf" podStartSLOduration=102.896124649 podStartE2EDuration="1m42.896124649s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.83302803 +0000 UTC m=+122.483964047" watchObservedRunningTime="2026-02-01 06:44:51.896124649 +0000 UTC m=+122.547060665" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.918499 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qzplm"] Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.926450 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.944850 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.945222 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-catalog-content\") pod \"community-operators-gbmsq\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.945254 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n47v4\" (UniqueName: \"kubernetes.io/projected/44bb6d15-c261-475e-9978-1d1495b630eb-kube-api-access-n47v4\") pod \"community-operators-gbmsq\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.945325 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-utilities\") pod \"community-operators-gbmsq\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.945735 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-utilities\") pod \"community-operators-gbmsq\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " 
pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:51 crc kubenswrapper[4546]: E0201 06:44:51.945836 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:52.445813801 +0000 UTC m=+123.096749817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.946059 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-catalog-content\") pod \"community-operators-gbmsq\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.981833 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6wz2l" podStartSLOduration=102.981809195 podStartE2EDuration="1m42.981809195s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:51.943552322 +0000 UTC m=+122.594488338" watchObservedRunningTime="2026-02-01 06:44:51.981809195 +0000 UTC m=+122.632745201" Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.983370 4546 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-qzplm"] Feb 01 06:44:51 crc kubenswrapper[4546]: I0201 06:44:51.991257 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n47v4\" (UniqueName: \"kubernetes.io/projected/44bb6d15-c261-475e-9978-1d1495b630eb-kube-api-access-n47v4\") pod \"community-operators-gbmsq\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.047808 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-catalog-content\") pod \"certified-operators-qzplm\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.048128 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8npnn\" (UniqueName: \"kubernetes.io/projected/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-kube-api-access-8npnn\") pod \"certified-operators-qzplm\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.048158 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.048181 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-utilities\") pod \"certified-operators-qzplm\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.048513 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:52.54849315 +0000 UTC m=+123.199429165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.055871 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.066124 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" podStartSLOduration=103.066104173 podStartE2EDuration="1m43.066104173s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:52.064594561 +0000 UTC m=+122.715530577" watchObservedRunningTime="2026-02-01 06:44:52.066104173 +0000 UTC m=+122.717040189" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.137451 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" podStartSLOduration=103.137433696 podStartE2EDuration="1m43.137433696s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:52.137231706 +0000 UTC m=+122.788167712" watchObservedRunningTime="2026-02-01 06:44:52.137433696 +0000 UTC m=+122.788369712" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.143532 4546 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vjdk4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.143590 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" podUID="7c73d592-2bf8-4b99-abdc-2fdeea5f2245" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.149283 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.149452 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-utilities\") pod \"certified-operators-qzplm\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.149573 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-catalog-content\") pod \"certified-operators-qzplm\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.149609 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8npnn\" (UniqueName: \"kubernetes.io/projected/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-kube-api-access-8npnn\") pod \"certified-operators-qzplm\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.149910 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-01 06:44:52.649898258 +0000 UTC m=+123.300834274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.150235 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-catalog-content\") pod \"certified-operators-qzplm\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.154252 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-utilities\") pod \"certified-operators-qzplm\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.159309 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t2phc" event={"ID":"7397ef95-4126-4f2e-9ba4-162440d6b87f","Type":"ContainerStarted","Data":"62f1c1890fa528abfcdb46d059c64a64edd921931351f7567c6dc446765e3b42"} Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.207823 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8npnn\" (UniqueName: \"kubernetes.io/projected/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-kube-api-access-8npnn\") pod \"certified-operators-qzplm\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " 
pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.254926 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.259088 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:52.759074091 +0000 UTC m=+123.410010107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.281211 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.283892 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vjdk4" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.288483 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:52 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:52 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:52 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.288549 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.356489 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.356933 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:52.856917523 +0000 UTC m=+123.507853529 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.458306 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.459117 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:52.959103533 +0000 UTC m=+123.610039550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.559934 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.560134 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.060106996 +0000 UTC m=+123.711043002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.560328 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.560679 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.060671299 +0000 UTC m=+123.711607315 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.662021 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.662172 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.162145388 +0000 UTC m=+123.813081404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.662244 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.662536 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.162528139 +0000 UTC m=+123.813464154 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.763909 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.764073 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.264044857 +0000 UTC m=+123.914980874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.764671 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.764982 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.264971383 +0000 UTC m=+123.915907399 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.866010 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.866060 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.366012487 +0000 UTC m=+124.016948502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.867135 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.867489 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.367463419 +0000 UTC m=+124.018399425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.892154 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fkgtj"] Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.968137 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.968262 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.468239334 +0000 UTC m=+124.119175340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.968588 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:52 crc kubenswrapper[4546]: E0201 06:44:52.968925 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.468911259 +0000 UTC m=+124.119847275 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:52 crc kubenswrapper[4546]: I0201 06:44:52.991727 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kn94x"] Feb 01 06:44:53 crc kubenswrapper[4546]: W0201 06:44:53.008671 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbca0710d_d2ea_4726_84bb_0bf49d93a63a.slice/crio-9e092949252ef01552e03f5c1eb750dc5019dee92682572aa56d22588852f39c WatchSource:0}: Error finding container 9e092949252ef01552e03f5c1eb750dc5019dee92682572aa56d22588852f39c: Status 404 returned error can't find the container with id 9e092949252ef01552e03f5c1eb750dc5019dee92682572aa56d22588852f39c Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.069596 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.070007 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.569992949 +0000 UTC m=+124.220928965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.177362 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.177763 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.677748025 +0000 UTC m=+124.328684042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.184141 4546 generic.go:334] "Generic (PLEG): container finished" podID="47612608-8394-4713-b59a-172469b14bbc" containerID="d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a" exitCode=0 Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.184218 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkgtj" event={"ID":"47612608-8394-4713-b59a-172469b14bbc","Type":"ContainerDied","Data":"d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a"} Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.184264 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkgtj" event={"ID":"47612608-8394-4713-b59a-172469b14bbc","Type":"ContainerStarted","Data":"70f82743a4aae04861a733f03b0e5e9b0c44cc4b28ec4366fc4869643723e0d3"} Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.196018 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.198057 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kn94x" event={"ID":"bca0710d-d2ea-4726-84bb-0bf49d93a63a","Type":"ContainerStarted","Data":"604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974"} Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.198100 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-kn94x" event={"ID":"bca0710d-d2ea-4726-84bb-0bf49d93a63a","Type":"ContainerStarted","Data":"9e092949252ef01552e03f5c1eb750dc5019dee92682572aa56d22588852f39c"} Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.215220 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t2phc" event={"ID":"7397ef95-4126-4f2e-9ba4-162440d6b87f","Type":"ContainerStarted","Data":"ce7e867b7b87fe2cd4c99c9c409ebb96994a0bb0c9f0c74642bfbf1b72a2f2d5"} Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.215369 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t2phc" event={"ID":"7397ef95-4126-4f2e-9ba4-162440d6b87f","Type":"ContainerStarted","Data":"cb2b4eef4997d82845802411d554e44cbc841641566944bc9f823f71e431010f"} Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.238073 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzplm"] Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.281810 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.283337 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.782809371 +0000 UTC m=+124.433745386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.293309 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:53 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:53 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:53 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.293524 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.300782 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-t2phc" podStartSLOduration=11.300765985 podStartE2EDuration="11.300765985s" podCreationTimestamp="2026-02-01 06:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:53.29743979 +0000 UTC m=+123.948375807" watchObservedRunningTime="2026-02-01 06:44:53.300765985 +0000 UTC m=+123.951701991" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.300901 4546 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-vr449"] Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.301929 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.305120 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.330563 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vr449"] Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.367448 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gbmsq"] Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.384140 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwwh2\" (UniqueName: \"kubernetes.io/projected/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-kube-api-access-vwwh2\") pod \"redhat-marketplace-vr449\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.384297 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-catalog-content\") pod \"redhat-marketplace-vr449\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.384347 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-utilities\") pod \"redhat-marketplace-vr449\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " 
pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.384418 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.384809 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.884784322 +0000 UTC m=+124.535720338 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.485783 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.485962 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-02-01 06:44:53.985941675 +0000 UTC m=+124.636877691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.486740 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-catalog-content\") pod \"redhat-marketplace-vr449\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.486866 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-utilities\") pod \"redhat-marketplace-vr449\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.487006 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.487180 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwwh2\" (UniqueName: 
\"kubernetes.io/projected/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-kube-api-access-vwwh2\") pod \"redhat-marketplace-vr449\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.487247 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-catalog-content\") pod \"redhat-marketplace-vr449\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.487271 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-utilities\") pod \"redhat-marketplace-vr449\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.487453 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:53.987430789 +0000 UTC m=+124.638366805 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.512639 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwwh2\" (UniqueName: \"kubernetes.io/projected/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-kube-api-access-vwwh2\") pod \"redhat-marketplace-vr449\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.538677 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.539325 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.541462 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.541596 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.549406 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.588407 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.588554 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.088528239 +0000 UTC m=+124.739464245 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.588664 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.589005 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.088991552 +0000 UTC m=+124.739927568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.616908 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.689747 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.689928 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.189903653 +0000 UTC m=+124.840839669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.690058 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.690112 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/65085cec-539d-4e24-8e73-28bf135b5883-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"65085cec-539d-4e24-8e73-28bf135b5883\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.690150 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65085cec-539d-4e24-8e73-28bf135b5883-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"65085cec-539d-4e24-8e73-28bf135b5883\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.690391 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.190383536 +0000 UTC m=+124.841319552 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.712628 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9w4m8"] Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.714910 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.743061 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9w4m8"] Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.791188 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.791381 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.2913504 +0000 UTC m=+124.942286416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.791494 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65085cec-539d-4e24-8e73-28bf135b5883-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"65085cec-539d-4e24-8e73-28bf135b5883\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.791546 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wv4r\" (UniqueName: \"kubernetes.io/projected/6378c03c-77b0-4d0d-8dd3-2b789468177a-kube-api-access-2wv4r\") pod \"redhat-marketplace-9w4m8\" (UID: \"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.791570 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65085cec-539d-4e24-8e73-28bf135b5883-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"65085cec-539d-4e24-8e73-28bf135b5883\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.791599 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-catalog-content\") pod \"redhat-marketplace-9w4m8\" (UID: 
\"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.791634 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-utilities\") pod \"redhat-marketplace-9w4m8\" (UID: \"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.791717 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.791708 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65085cec-539d-4e24-8e73-28bf135b5883-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"65085cec-539d-4e24-8e73-28bf135b5883\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.792022 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.292005815 +0000 UTC m=+124.942941830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.808375 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65085cec-539d-4e24-8e73-28bf135b5883-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"65085cec-539d-4e24-8e73-28bf135b5883\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.892470 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.892655 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.392629262 +0000 UTC m=+125.043565268 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.893163 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.893228 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wv4r\" (UniqueName: \"kubernetes.io/projected/6378c03c-77b0-4d0d-8dd3-2b789468177a-kube-api-access-2wv4r\") pod \"redhat-marketplace-9w4m8\" (UID: \"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.893268 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-catalog-content\") pod \"redhat-marketplace-9w4m8\" (UID: \"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.893318 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-utilities\") pod \"redhat-marketplace-9w4m8\" (UID: 
\"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.893522 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.393510751 +0000 UTC m=+125.044446757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.893810 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-catalog-content\") pod \"redhat-marketplace-9w4m8\" (UID: \"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.893865 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-utilities\") pod \"redhat-marketplace-9w4m8\" (UID: \"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.912778 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wv4r\" (UniqueName: \"kubernetes.io/projected/6378c03c-77b0-4d0d-8dd3-2b789468177a-kube-api-access-2wv4r\") pod \"redhat-marketplace-9w4m8\" (UID: 
\"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.951835 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.980331 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vr449"] Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.993953 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.994059 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.494034782 +0000 UTC m=+125.144970798 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:53 crc kubenswrapper[4546]: I0201 06:44:53.994299 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:53 crc kubenswrapper[4546]: E0201 06:44:53.994670 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.494647886 +0000 UTC m=+125.145583903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.012896 4546 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.043033 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.096075 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:54 crc kubenswrapper[4546]: E0201 06:44:54.096394 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.596298948 +0000 UTC m=+125.247234965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.096876 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:54 crc kubenswrapper[4546]: E0201 06:44:54.097171 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.597157817 +0000 UTC m=+125.248093832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.189462 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.197725 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:54 crc kubenswrapper[4546]: E0201 06:44:54.198164 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.698148856 +0000 UTC m=+125.349084872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:54 crc kubenswrapper[4546]: W0201 06:44:54.209995 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod65085cec_539d_4e24_8e73_28bf135b5883.slice/crio-1c353bbe8bd0e6cdbb88408a8cdf5939454a8ec4c20902349658661b3d074b34 WatchSource:0}: Error finding container 1c353bbe8bd0e6cdbb88408a8cdf5939454a8ec4c20902349658661b3d074b34: Status 404 returned error can't find the container with id 1c353bbe8bd0e6cdbb88408a8cdf5939454a8ec4c20902349658661b3d074b34 Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.221642 4546 generic.go:334] "Generic (PLEG): container finished" podID="44bb6d15-c261-475e-9978-1d1495b630eb" containerID="f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93" exitCode=0 Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.221722 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbmsq" event={"ID":"44bb6d15-c261-475e-9978-1d1495b630eb","Type":"ContainerDied","Data":"f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93"} Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.221759 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbmsq" event={"ID":"44bb6d15-c261-475e-9978-1d1495b630eb","Type":"ContainerStarted","Data":"bc6d850a0a54e5178320210026d94ad8b4d458dc13e83224c7b8ae10ce247407"} Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.223871 4546 generic.go:334] "Generic (PLEG): container finished" 
podID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerID="604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974" exitCode=0 Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.223919 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kn94x" event={"ID":"bca0710d-d2ea-4726-84bb-0bf49d93a63a","Type":"ContainerDied","Data":"604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974"} Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.227895 4546 generic.go:334] "Generic (PLEG): container finished" podID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerID="61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648" exitCode=0 Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.227954 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzplm" event={"ID":"6d411dc4-ef2d-4e39-9111-e2ae62f83b37","Type":"ContainerDied","Data":"61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648"} Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.227977 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzplm" event={"ID":"6d411dc4-ef2d-4e39-9111-e2ae62f83b37","Type":"ContainerStarted","Data":"f3487d969061c2fa99d277b1c0e05a1dbede2e7d98c0eb8a0bc9cb9e845bd247"} Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.230773 4546 generic.go:334] "Generic (PLEG): container finished" podID="0904ae3e-72bf-4b72-9c6b-734d840b9cf5" containerID="30fd37b455f8af9eb03ce8fc4accd607a66296af0632af56c6f814f570eacd0f" exitCode=0 Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.230840 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" event={"ID":"0904ae3e-72bf-4b72-9c6b-734d840b9cf5","Type":"ContainerDied","Data":"30fd37b455f8af9eb03ce8fc4accd607a66296af0632af56c6f814f570eacd0f"} Feb 01 06:44:54 crc kubenswrapper[4546]: 
I0201 06:44:54.232024 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"65085cec-539d-4e24-8e73-28bf135b5883","Type":"ContainerStarted","Data":"1c353bbe8bd0e6cdbb88408a8cdf5939454a8ec4c20902349658661b3d074b34"} Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.238254 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerID="1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0" exitCode=0 Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.238718 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vr449" event={"ID":"a4096fe8-44f5-466f-9d1c-9d32a9f7396e","Type":"ContainerDied","Data":"1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0"} Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.238759 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vr449" event={"ID":"a4096fe8-44f5-466f-9d1c-9d32a9f7396e","Type":"ContainerStarted","Data":"fed014177d6ae24b836984e9ad0ea594bd73c1b687fcdf6b89cca5109706cc1e"} Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.280776 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9w4m8"] Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.290310 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:54 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:54 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:54 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.290369 4546 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.299755 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:54 crc kubenswrapper[4546]: E0201 06:44:54.300971 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.80095333 +0000 UTC m=+125.451889346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.400462 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:54 crc kubenswrapper[4546]: E0201 06:44:54.401398 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:54.901383223 +0000 UTC m=+125.552319239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.501643 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:54 crc kubenswrapper[4546]: E0201 06:44:54.502291 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:55.002266509 +0000 UTC m=+125.653202516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.553784 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfkxw" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.603417 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:54 crc kubenswrapper[4546]: E0201 06:44:54.603789 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:44:55.103776696 +0000 UTC m=+125.754712713 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.679999 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fknp6"] Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.680905 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.684199 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.704563 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:54 crc kubenswrapper[4546]: E0201 06:44:54.704877 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:44:55.204837507 +0000 UTC m=+125.855773523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bt5sd" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.732384 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fknp6"] Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.780707 4546 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-01T06:44:54.012919995Z","Handler":null,"Name":""} Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.800235 4546 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.800276 4546 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.805601 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.805815 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdnzz\" 
(UniqueName: \"kubernetes.io/projected/318d8499-a380-4204-b4ee-15d2692874e3-kube-api-access-bdnzz\") pod \"redhat-operators-fknp6\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.805872 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-utilities\") pod \"redhat-operators-fknp6\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.805890 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-catalog-content\") pod \"redhat-operators-fknp6\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.813552 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.890277 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.891148 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.892660 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.894134 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.894223 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.906941 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdnzz\" (UniqueName: \"kubernetes.io/projected/318d8499-a380-4204-b4ee-15d2692874e3-kube-api-access-bdnzz\") pod \"redhat-operators-fknp6\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.906986 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-utilities\") pod \"redhat-operators-fknp6\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.907006 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-catalog-content\") pod \"redhat-operators-fknp6\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.907078 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.908212 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-utilities\") pod \"redhat-operators-fknp6\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.908277 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-catalog-content\") pod \"redhat-operators-fknp6\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.923589 4546 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.923614 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.951988 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdnzz\" (UniqueName: \"kubernetes.io/projected/318d8499-a380-4204-b4ee-15d2692874e3-kube-api-access-bdnzz\") pod \"redhat-operators-fknp6\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.966514 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bt5sd\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:54 crc kubenswrapper[4546]: I0201 06:44:54.998156 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.008686 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.008753 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.098136 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qqn2r"] Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.099251 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.109721 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.109796 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.109985 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.133427 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qqn2r"] Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.145314 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.165668 4546 patch_prober.go:28] interesting pod/downloads-7954f5f757-wfvhf container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.165711 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wfvhf" podUID="86b65b33-e838-40a0-84fa-e7c2a659cc1d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.166062 4546 patch_prober.go:28] interesting pod/downloads-7954f5f757-wfvhf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.166088 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wfvhf" podUID="86b65b33-e838-40a0-84fa-e7c2a659cc1d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.206028 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.211839 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-catalog-content\") pod \"redhat-operators-qqn2r\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.211891 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftchc\" (UniqueName: \"kubernetes.io/projected/91bca231-e9f9-42b7-aa32-db383a098b5b-kube-api-access-ftchc\") pod \"redhat-operators-qqn2r\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.211963 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-utilities\") pod \"redhat-operators-qqn2r\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.223738 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.232174 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.278774 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mcws5" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.284212 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:55 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:55 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:55 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.289890 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.313163 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-catalog-content\") pod \"redhat-operators-qqn2r\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.313209 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftchc\" (UniqueName: \"kubernetes.io/projected/91bca231-e9f9-42b7-aa32-db383a098b5b-kube-api-access-ftchc\") pod \"redhat-operators-qqn2r\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.314027 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-utilities\") pod \"redhat-operators-qqn2r\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.315468 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-catalog-content\") pod \"redhat-operators-qqn2r\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.316161 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-utilities\") pod \"redhat-operators-qqn2r\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.337540 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftchc\" (UniqueName: \"kubernetes.io/projected/91bca231-e9f9-42b7-aa32-db383a098b5b-kube-api-access-ftchc\") pod \"redhat-operators-qqn2r\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.347307 4546 generic.go:334] "Generic (PLEG): container finished" podID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerID="a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06" exitCode=0 Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.347365 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w4m8" 
event={"ID":"6378c03c-77b0-4d0d-8dd3-2b789468177a","Type":"ContainerDied","Data":"a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06"} Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.347392 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w4m8" event={"ID":"6378c03c-77b0-4d0d-8dd3-2b789468177a","Type":"ContainerStarted","Data":"47986c449b6af641f4ce9c6a566d8ddab2be8d9a6c3c52fdbce0cad60cd976fc"} Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.361088 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fknp6"] Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.374213 4546 generic.go:334] "Generic (PLEG): container finished" podID="65085cec-539d-4e24-8e73-28bf135b5883" containerID="eed065262f976b23155ff6bdd910bb306607fccbfb2385bcfe93b4746c6cb9bb" exitCode=0 Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.374647 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"65085cec-539d-4e24-8e73-28bf135b5883","Type":"ContainerDied","Data":"eed065262f976b23155ff6bdd910bb306607fccbfb2385bcfe93b4746c6cb9bb"} Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.408235 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.408272 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.410282 4546 patch_prober.go:28] interesting pod/console-f9d7485db-8659n container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.410332 4546 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8659n" podUID="81d1f1d9-4f02-4d8e-946c-9cc1592090ae" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.415822 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.562419 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.562696 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.575268 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.592229 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.592495 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.630544 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.693202 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.747251 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.751845 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.875663 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5sd"] Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.904050 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:55 crc kubenswrapper[4546]: I0201 06:44:55.996216 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qqn2r"] Feb 01 06:44:56 crc kubenswrapper[4546]: W0201 06:44:56.006317 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91bca231_e9f9_42b7_aa32_db383a098b5b.slice/crio-38aee8309b676e82a8e79d50d6d0ba7fa35e07ba815ed877aa5bf89cdcba44ba WatchSource:0}: Error finding container 38aee8309b676e82a8e79d50d6d0ba7fa35e07ba815ed877aa5bf89cdcba44ba: Status 404 returned error can't find the container with id 38aee8309b676e82a8e79d50d6d0ba7fa35e07ba815ed877aa5bf89cdcba44ba Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.034292 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhdcq\" (UniqueName: \"kubernetes.io/projected/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-kube-api-access-bhdcq\") pod \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.034445 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-secret-volume\") pod 
\"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.034657 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-config-volume\") pod \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\" (UID: \"0904ae3e-72bf-4b72-9c6b-734d840b9cf5\") " Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.035652 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-config-volume" (OuterVolumeSpecName: "config-volume") pod "0904ae3e-72bf-4b72-9c6b-734d840b9cf5" (UID: "0904ae3e-72bf-4b72-9c6b-734d840b9cf5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.036701 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.044072 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0904ae3e-72bf-4b72-9c6b-734d840b9cf5" (UID: "0904ae3e-72bf-4b72-9c6b-734d840b9cf5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.048179 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-kube-api-access-bhdcq" (OuterVolumeSpecName: "kube-api-access-bhdcq") pod "0904ae3e-72bf-4b72-9c6b-734d840b9cf5" (UID: "0904ae3e-72bf-4b72-9c6b-734d840b9cf5"). InnerVolumeSpecName "kube-api-access-bhdcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.138014 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhdcq\" (UniqueName: \"kubernetes.io/projected/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-kube-api-access-bhdcq\") on node \"crc\" DevicePath \"\"" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.138038 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0904ae3e-72bf-4b72-9c6b-734d840b9cf5-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.286289 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:56 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:56 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:56 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.286365 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.403892 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" event={"ID":"0904ae3e-72bf-4b72-9c6b-734d840b9cf5","Type":"ContainerDied","Data":"785b532f6674809ecba271333a7ab739762411fb28b0f531f629c083eb2ae142"} Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.404202 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="785b532f6674809ecba271333a7ab739762411fb28b0f531f629c083eb2ae142" Feb 01 
06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.404030 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.436794 4546 generic.go:334] "Generic (PLEG): container finished" podID="318d8499-a380-4204-b4ee-15d2692874e3" containerID="8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a" exitCode=0 Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.436830 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fknp6" event={"ID":"318d8499-a380-4204-b4ee-15d2692874e3","Type":"ContainerDied","Data":"8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a"} Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.436881 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fknp6" event={"ID":"318d8499-a380-4204-b4ee-15d2692874e3","Type":"ContainerStarted","Data":"1bb4b2e0137fb6c23084990fc45c7c322d8502bf9a590bb508721238815fc0c5"} Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.453658 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" event={"ID":"813828d1-6b58-42d0-a3e6-b5b0c67423c7","Type":"ContainerStarted","Data":"b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2"} Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.453699 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" event={"ID":"813828d1-6b58-42d0-a3e6-b5b0c67423c7","Type":"ContainerStarted","Data":"4a4a24880595dcd2c8ffd5c245e2260161e96a949299ea346ddd87381d8e4a1c"} Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.484984 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" 
podStartSLOduration=107.484966556 podStartE2EDuration="1m47.484966556s" podCreationTimestamp="2026-02-01 06:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:44:56.473521103 +0000 UTC m=+127.124457119" watchObservedRunningTime="2026-02-01 06:44:56.484966556 +0000 UTC m=+127.135902572" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.499600 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqn2r" event={"ID":"91bca231-e9f9-42b7-aa32-db383a098b5b","Type":"ContainerStarted","Data":"38aee8309b676e82a8e79d50d6d0ba7fa35e07ba815ed877aa5bf89cdcba44ba"} Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.508134 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6","Type":"ContainerStarted","Data":"b6b0f4a0b47334280c529acc4775a145ef62f171c640093339bfded0afd53a05"} Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.511704 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-b4wcw" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.531077 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dlnvv" Feb 01 06:44:56 crc kubenswrapper[4546]: I0201 06:44:56.945017 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.061073 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65085cec-539d-4e24-8e73-28bf135b5883-kubelet-dir\") pod \"65085cec-539d-4e24-8e73-28bf135b5883\" (UID: \"65085cec-539d-4e24-8e73-28bf135b5883\") " Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.061198 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65085cec-539d-4e24-8e73-28bf135b5883-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "65085cec-539d-4e24-8e73-28bf135b5883" (UID: "65085cec-539d-4e24-8e73-28bf135b5883"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.061241 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65085cec-539d-4e24-8e73-28bf135b5883-kube-api-access\") pod \"65085cec-539d-4e24-8e73-28bf135b5883\" (UID: \"65085cec-539d-4e24-8e73-28bf135b5883\") " Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.061712 4546 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65085cec-539d-4e24-8e73-28bf135b5883-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.066763 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65085cec-539d-4e24-8e73-28bf135b5883-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "65085cec-539d-4e24-8e73-28bf135b5883" (UID: "65085cec-539d-4e24-8e73-28bf135b5883"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.163885 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/65085cec-539d-4e24-8e73-28bf135b5883-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.289690 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:57 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:57 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:57 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.289773 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.554258 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"65085cec-539d-4e24-8e73-28bf135b5883","Type":"ContainerDied","Data":"1c353bbe8bd0e6cdbb88408a8cdf5939454a8ec4c20902349658661b3d074b34"} Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.554293 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c353bbe8bd0e6cdbb88408a8cdf5939454a8ec4c20902349658661b3d074b34" Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.554427 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.560526 4546 generic.go:334] "Generic (PLEG): container finished" podID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerID="6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da" exitCode=0 Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.560593 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqn2r" event={"ID":"91bca231-e9f9-42b7-aa32-db383a098b5b","Type":"ContainerDied","Data":"6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da"} Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.565077 4546 generic.go:334] "Generic (PLEG): container finished" podID="c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6" containerID="fe946b3155749c1a10d37895ec1c3675a662e4fe1520b1b2c14bd98b2cd3301a" exitCode=0 Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.565716 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6","Type":"ContainerDied","Data":"fe946b3155749c1a10d37895ec1c3675a662e4fe1520b1b2c14bd98b2cd3301a"} Feb 01 06:44:57 crc kubenswrapper[4546]: I0201 06:44:57.565803 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:44:58 crc kubenswrapper[4546]: I0201 06:44:58.281304 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:58 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:58 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:58 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:58 crc kubenswrapper[4546]: I0201 
06:44:58.281372 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.012775 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.059130 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.116260 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kube-api-access\") pod \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\" (UID: \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\") " Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.116318 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kubelet-dir\") pod \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\" (UID: \"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6\") " Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.117716 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6" (UID: "c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.125633 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6" (UID: "c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.218428 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.218458 4546 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.280224 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:44:59 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld Feb 01 06:44:59 crc kubenswrapper[4546]: [+]process-running ok Feb 01 06:44:59 crc kubenswrapper[4546]: healthz check failed Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.280267 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.631322 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6","Type":"ContainerDied","Data":"b6b0f4a0b47334280c529acc4775a145ef62f171c640093339bfded0afd53a05"} Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.631364 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b0f4a0b47334280c529acc4775a145ef62f171c640093339bfded0afd53a05" Feb 01 06:44:59 crc kubenswrapper[4546]: I0201 06:44:59.631434 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.128448 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"] Feb 01 06:45:00 crc kubenswrapper[4546]: E0201 06:45:00.131188 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6" containerName="pruner" Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.131208 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6" containerName="pruner" Feb 01 06:45:00 crc kubenswrapper[4546]: E0201 06:45:00.131220 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65085cec-539d-4e24-8e73-28bf135b5883" containerName="pruner" Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.131226 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="65085cec-539d-4e24-8e73-28bf135b5883" containerName="pruner" Feb 01 06:45:00 crc kubenswrapper[4546]: E0201 06:45:00.131241 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0904ae3e-72bf-4b72-9c6b-734d840b9cf5" containerName="collect-profiles" Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.131247 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ae3e-72bf-4b72-9c6b-734d840b9cf5" containerName="collect-profiles" Feb 01 06:45:00 crc 
kubenswrapper[4546]: I0201 06:45:00.131371 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="65085cec-539d-4e24-8e73-28bf135b5883" containerName="pruner"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.131383 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a9ce77-2ac1-4ca1-bcf2-60aa9dc783e6" containerName="pruner"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.131390 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="0904ae3e-72bf-4b72-9c6b-734d840b9cf5" containerName="collect-profiles"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.134839 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"]
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.134978 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.141727 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.142263 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.237776 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfthq\" (UniqueName: \"kubernetes.io/projected/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-kube-api-access-jfthq\") pod \"collect-profiles-29498805-t8nbr\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.237886 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-config-volume\") pod \"collect-profiles-29498805-t8nbr\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.237923 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-secret-volume\") pod \"collect-profiles-29498805-t8nbr\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.283435 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 01 06:45:00 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld
Feb 01 06:45:00 crc kubenswrapper[4546]: [+]process-running ok
Feb 01 06:45:00 crc kubenswrapper[4546]: healthz check failed
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.283490 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.343849 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfthq\" (UniqueName: \"kubernetes.io/projected/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-kube-api-access-jfthq\") pod \"collect-profiles-29498805-t8nbr\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.343926 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-config-volume\") pod \"collect-profiles-29498805-t8nbr\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.343955 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-secret-volume\") pod \"collect-profiles-29498805-t8nbr\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.345722 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-config-volume\") pod \"collect-profiles-29498805-t8nbr\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.359325 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-secret-volume\") pod \"collect-profiles-29498805-t8nbr\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.363300 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfthq\" (UniqueName: \"kubernetes.io/projected/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-kube-api-access-jfthq\") pod \"collect-profiles-29498805-t8nbr\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") "
pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.473085 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:00 crc kubenswrapper[4546]: I0201 06:45:00.892919 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"]
Feb 01 06:45:01 crc kubenswrapper[4546]: I0201 06:45:01.282066 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 01 06:45:01 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld
Feb 01 06:45:01 crc kubenswrapper[4546]: [+]process-running ok
Feb 01 06:45:01 crc kubenswrapper[4546]: healthz check failed
Feb 01 06:45:01 crc kubenswrapper[4546]: I0201 06:45:01.282456 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 01 06:45:01 crc kubenswrapper[4546]: I0201 06:45:01.385666 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9rzq5"
Feb 01 06:45:01 crc kubenswrapper[4546]: I0201 06:45:01.672601 4546 generic.go:334] "Generic (PLEG): container finished" podID="dfd9b242-1e4e-46f9-b8fb-04175b46cf9a" containerID="933e09563ae018cedf41d60679f6ecf138c654b345a755f567f11bc247d7d4ba" exitCode=0
Feb 01 06:45:01 crc kubenswrapper[4546]: I0201 06:45:01.672640 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr" event={"ID":"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a","Type":"ContainerDied","Data":"933e09563ae018cedf41d60679f6ecf138c654b345a755f567f11bc247d7d4ba"}
Feb 01 06:45:01 crc kubenswrapper[4546]: I0201 06:45:01.672664 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr" event={"ID":"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a","Type":"ContainerStarted","Data":"5767d7ccbbd752e3c143f55b1257853e3a213dc9545a6c8101cdbcbd42e6081e"}
Feb 01 06:45:02 crc kubenswrapper[4546]: I0201 06:45:02.286007 4546 patch_prober.go:28] interesting pod/router-default-5444994796-mcws5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 01 06:45:02 crc kubenswrapper[4546]: [-]has-synced failed: reason withheld
Feb 01 06:45:02 crc kubenswrapper[4546]: [+]process-running ok
Feb 01 06:45:02 crc kubenswrapper[4546]: healthz check failed
Feb 01 06:45:02 crc kubenswrapper[4546]: I0201 06:45:02.286250 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mcws5" podUID="3c232787-4f08-451b-ab33-d78c86f00dc7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 01 06:45:03 crc kubenswrapper[4546]: I0201 06:45:03.281031 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mcws5"
Feb 01 06:45:03 crc kubenswrapper[4546]: I0201 06:45:03.285517 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mcws5"
Feb 01 06:45:05 crc kubenswrapper[4546]: I0201 06:45:05.174904 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wfvhf"
Feb 01 06:45:05 crc kubenswrapper[4546]: I0201 06:45:05.409047 4546 patch_prober.go:28] interesting
pod/console-f9d7485db-8659n container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Feb 01 06:45:05 crc kubenswrapper[4546]: I0201 06:45:05.409125 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8659n" podUID="81d1f1d9-4f02-4d8e-946c-9cc1592090ae" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused"
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.288187 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.411283 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-secret-volume\") pod \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") "
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.415128 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-config-volume\") pod \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") "
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.415383 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfthq\" (UniqueName: \"kubernetes.io/projected/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-kube-api-access-jfthq\") pod \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\" (UID: \"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a\") "
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.415741 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-config-volume" (OuterVolumeSpecName: "config-volume") pod "dfd9b242-1e4e-46f9-b8fb-04175b46cf9a" (UID: "dfd9b242-1e4e-46f9-b8fb-04175b46cf9a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.416306 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-config-volume\") on node \"crc\" DevicePath \"\""
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.420114 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-kube-api-access-jfthq" (OuterVolumeSpecName: "kube-api-access-jfthq") pod "dfd9b242-1e4e-46f9-b8fb-04175b46cf9a" (UID: "dfd9b242-1e4e-46f9-b8fb-04175b46cf9a"). InnerVolumeSpecName "kube-api-access-jfthq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.433032 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dfd9b242-1e4e-46f9-b8fb-04175b46cf9a" (UID: "dfd9b242-1e4e-46f9-b8fb-04175b46cf9a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.517507 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfthq\" (UniqueName: \"kubernetes.io/projected/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-kube-api-access-jfthq\") on node \"crc\" DevicePath \"\""
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.517540 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.778151 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr" event={"ID":"dfd9b242-1e4e-46f9-b8fb-04175b46cf9a","Type":"ContainerDied","Data":"5767d7ccbbd752e3c143f55b1257853e3a213dc9545a6c8101cdbcbd42e6081e"}
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.778247 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5767d7ccbbd752e3c143f55b1257853e3a213dc9545a6c8101cdbcbd42e6081e"
Feb 01 06:45:08 crc kubenswrapper[4546]: I0201 06:45:08.778313 4546 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.237680 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.412026 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8659n"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.415090 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8659n"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.518625 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.518753 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.521539 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.522605 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.531466 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.540457 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.620821 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.621234 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.622288 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.632814 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.646933 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.653037 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.765656 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.772048 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 06:45:15 crc kubenswrapper[4546]: I0201 06:45:15.778831 4546 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:45:21 crc kubenswrapper[4546]: E0201 06:45:21.838552 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 01 06:45:21 crc kubenswrapper[4546]: E0201 06:45:21.839271 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-89lh8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-fkgtj_openshift-marketplace(47612608-8394-4713-b59a-172469b14bbc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:45:21 crc kubenswrapper[4546]: E0201 06:45:21.840621 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fkgtj" podUID="47612608-8394-4713-b59a-172469b14bbc"
Feb 01 06:45:21 crc kubenswrapper[4546]: E0201 06:45:21.861599 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 01 06:45:21 crc kubenswrapper[4546]: E0201 06:45:21.861733 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8npnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qzplm_openshift-marketplace(6d411dc4-ef2d-4e39-9111-e2ae62f83b37): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:45:21 crc kubenswrapper[4546]: E0201 06:45:21.862926 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qzplm" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37"
Feb 01 06:45:23 crc
kubenswrapper[4546]: E0201 06:45:23.293949 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-fkgtj" podUID="47612608-8394-4713-b59a-172469b14bbc"
Feb 01 06:45:23 crc kubenswrapper[4546]: E0201 06:45:23.294576 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qzplm" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37"
Feb 01 06:45:23 crc kubenswrapper[4546]: E0201 06:45:23.399722 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 01 06:45:23 crc kubenswrapper[4546]: E0201 06:45:23.399952 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vwwh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vr449_openshift-marketplace(a4096fe8-44f5-466f-9d1c-9d32a9f7396e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:45:23 crc kubenswrapper[4546]: E0201 06:45:23.401042 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vr449" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e"
Feb 01 06:45:23 crc kubenswrapper[4546]: E0201 06:45:23.404542 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 01 06:45:23 crc kubenswrapper[4546]: E0201 06:45:23.404661 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wv4r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9w4m8_openshift-marketplace(6378c03c-77b0-4d0d-8dd3-2b789468177a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:45:23 crc kubenswrapper[4546]: E0201 06:45:23.406440 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9w4m8" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a"
Feb 01 06:45:23 crc kubenswrapper[4546]: W0201 06:45:23.836509 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-805ddd5a8720da5204bfc21942cd0ac5227a9eb6633f3801b7c9b006b143dcb3 WatchSource:0}: Error finding container 805ddd5a8720da5204bfc21942cd0ac5227a9eb6633f3801b7c9b006b143dcb3: Status 404 returned error can't find the container with id 805ddd5a8720da5204bfc21942cd0ac5227a9eb6633f3801b7c9b006b143dcb3
Feb 01 06:45:23 crc kubenswrapper[4546]: I0201 06:45:23.906100 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0d303bc9d8260cd62f2980d978fbd356b0a8f4fc8dcc5517cc8bfceefb703404"}
Feb 01 06:45:23 crc kubenswrapper[4546]: I0201 06:45:23.908082 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fknp6" event={"ID":"318d8499-a380-4204-b4ee-15d2692874e3","Type":"ContainerStarted","Data":"419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca"}
Feb 01 06:45:23 crc kubenswrapper[4546]: I0201 06:45:23.910329 4546 generic.go:334] "Generic (PLEG): container finished"
podID="44bb6d15-c261-475e-9978-1d1495b630eb" containerID="7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96" exitCode=0
Feb 01 06:45:23 crc kubenswrapper[4546]: I0201 06:45:23.910403 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbmsq" event={"ID":"44bb6d15-c261-475e-9978-1d1495b630eb","Type":"ContainerDied","Data":"7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96"}
Feb 01 06:45:23 crc kubenswrapper[4546]: I0201 06:45:23.913074 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqn2r" event={"ID":"91bca231-e9f9-42b7-aa32-db383a098b5b","Type":"ContainerStarted","Data":"139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790"}
Feb 01 06:45:23 crc kubenswrapper[4546]: I0201 06:45:23.928463 4546 generic.go:334] "Generic (PLEG): container finished" podID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerID="65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a" exitCode=0
Feb 01 06:45:23 crc kubenswrapper[4546]: I0201 06:45:23.928565 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kn94x" event={"ID":"bca0710d-d2ea-4726-84bb-0bf49d93a63a","Type":"ContainerDied","Data":"65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a"}
Feb 01 06:45:23 crc kubenswrapper[4546]: I0201 06:45:23.933034 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"805ddd5a8720da5204bfc21942cd0ac5227a9eb6633f3801b7c9b006b143dcb3"}
Feb 01 06:45:23 crc kubenswrapper[4546]: E0201 06:45:23.939142 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vr449" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e"
Feb 01 06:45:23 crc kubenswrapper[4546]: E0201 06:45:23.939196 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9w4m8" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a"
Feb 01 06:45:24 crc kubenswrapper[4546]: W0201 06:45:24.004588 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-26d669fed69336e6e09d92891fa683800b9d4234b47ce08430b7b7fe3ffa43b9 WatchSource:0}: Error finding container 26d669fed69336e6e09d92891fa683800b9d4234b47ce08430b7b7fe3ffa43b9: Status 404 returned error can't find the container with id 26d669fed69336e6e09d92891fa683800b9d4234b47ce08430b7b7fe3ffa43b9
Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.246212 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9n59f"]
Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.943546 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b77e7448196b6ef4525a73479d437a1a5b2695894a93fe4545fba123f4c93846"}
Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.944262 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"26d669fed69336e6e09d92891fa683800b9d4234b47ce08430b7b7fe3ffa43b9"}
Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.944517 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness"
status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.945780 4546 generic.go:334] "Generic (PLEG): container finished" podID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerID="139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790" exitCode=0 Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.946044 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqn2r" event={"ID":"91bca231-e9f9-42b7-aa32-db383a098b5b","Type":"ContainerDied","Data":"139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790"} Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.953249 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kn94x" event={"ID":"bca0710d-d2ea-4726-84bb-0bf49d93a63a","Type":"ContainerStarted","Data":"a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775"} Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.954357 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0202501f27a7d66b4d3e66689319282af487a9f762803f3cab260173b4ba7f02"} Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.955514 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"540de6a4c43288d3f1774a5d9bdd53d18eb28761f5e8df5629223e69211add37"} Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.957664 4546 generic.go:334] "Generic (PLEG): container finished" podID="318d8499-a380-4204-b4ee-15d2692874e3" containerID="419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca" exitCode=0 Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.957760 4546 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fknp6" event={"ID":"318d8499-a380-4204-b4ee-15d2692874e3","Type":"ContainerDied","Data":"419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca"} Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.961090 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbmsq" event={"ID":"44bb6d15-c261-475e-9978-1d1495b630eb","Type":"ContainerStarted","Data":"dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87"} Feb 01 06:45:24 crc kubenswrapper[4546]: I0201 06:45:24.995182 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kn94x" podStartSLOduration=3.792424626 podStartE2EDuration="33.995164792s" podCreationTimestamp="2026-02-01 06:44:51 +0000 UTC" firstStartedPulling="2026-02-01 06:44:54.233267859 +0000 UTC m=+124.884203876" lastFinishedPulling="2026-02-01 06:45:24.436008035 +0000 UTC m=+155.086944042" observedRunningTime="2026-02-01 06:45:24.990332191 +0000 UTC m=+155.641268207" watchObservedRunningTime="2026-02-01 06:45:24.995164792 +0000 UTC m=+155.646100808" Feb 01 06:45:25 crc kubenswrapper[4546]: I0201 06:45:25.079153 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gbmsq" podStartSLOduration=3.916618508 podStartE2EDuration="34.079137613s" podCreationTimestamp="2026-02-01 06:44:51 +0000 UTC" firstStartedPulling="2026-02-01 06:44:54.238272854 +0000 UTC m=+124.889208870" lastFinishedPulling="2026-02-01 06:45:24.400791959 +0000 UTC m=+155.051727975" observedRunningTime="2026-02-01 06:45:25.073832063 +0000 UTC m=+155.724768079" watchObservedRunningTime="2026-02-01 06:45:25.079137613 +0000 UTC m=+155.730073629" Feb 01 06:45:25 crc kubenswrapper[4546]: I0201 06:45:25.420825 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:45:25 crc kubenswrapper[4546]: I0201 06:45:25.421378 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:45:25 crc kubenswrapper[4546]: I0201 06:45:25.735197 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fxg47" Feb 01 06:45:25 crc kubenswrapper[4546]: I0201 06:45:25.970679 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqn2r" event={"ID":"91bca231-e9f9-42b7-aa32-db383a098b5b","Type":"ContainerStarted","Data":"fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752"} Feb 01 06:45:25 crc kubenswrapper[4546]: I0201 06:45:25.975358 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fknp6" event={"ID":"318d8499-a380-4204-b4ee-15d2692874e3","Type":"ContainerStarted","Data":"7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397"} Feb 01 06:45:26 crc kubenswrapper[4546]: I0201 06:45:26.011354 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qqn2r" podStartSLOduration=3.077338031 podStartE2EDuration="31.011336026s" podCreationTimestamp="2026-02-01 06:44:55 +0000 UTC" firstStartedPulling="2026-02-01 06:44:57.580762785 +0000 UTC m=+128.231698801" lastFinishedPulling="2026-02-01 06:45:25.51476078 +0000 UTC m=+156.165696796" observedRunningTime="2026-02-01 06:45:26.008414926 +0000 
UTC m=+156.659350932" watchObservedRunningTime="2026-02-01 06:45:26.011336026 +0000 UTC m=+156.662272033" Feb 01 06:45:26 crc kubenswrapper[4546]: I0201 06:45:26.028277 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fknp6" podStartSLOduration=3.016823823 podStartE2EDuration="32.028263182s" podCreationTimestamp="2026-02-01 06:44:54 +0000 UTC" firstStartedPulling="2026-02-01 06:44:56.43918366 +0000 UTC m=+127.090119676" lastFinishedPulling="2026-02-01 06:45:25.45062302 +0000 UTC m=+156.101559035" observedRunningTime="2026-02-01 06:45:26.027680496 +0000 UTC m=+156.678616502" watchObservedRunningTime="2026-02-01 06:45:26.028263182 +0000 UTC m=+156.679199198" Feb 01 06:45:30 crc kubenswrapper[4546]: I0201 06:45:30.421460 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:45:30 crc kubenswrapper[4546]: I0201 06:45:30.424137 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 01 06:45:30 crc kubenswrapper[4546]: I0201 06:45:30.446352 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca3c024-0f0b-4651-8eb7-9a7e0511739c-metrics-certs\") pod \"network-metrics-daemon-8tdck\" (UID: \"1ca3c024-0f0b-4651-8eb7-9a7e0511739c\") " pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:45:30 crc kubenswrapper[4546]: I0201 06:45:30.485612 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 01 06:45:30 crc kubenswrapper[4546]: I0201 06:45:30.496612 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8tdck" Feb 01 06:45:30 crc kubenswrapper[4546]: I0201 06:45:30.946489 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8tdck"] Feb 01 06:45:31 crc kubenswrapper[4546]: I0201 06:45:31.013249 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8tdck" event={"ID":"1ca3c024-0f0b-4651-8eb7-9a7e0511739c","Type":"ContainerStarted","Data":"966607c4fa16847ae23a74880d9d5d37d37f2f2257d175c4654e7cf4e1b8db3a"} Feb 01 06:45:31 crc kubenswrapper[4546]: I0201 06:45:31.662139 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:45:31 crc kubenswrapper[4546]: I0201 06:45:31.662487 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:45:31 crc kubenswrapper[4546]: I0201 06:45:31.729374 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:45:32 crc kubenswrapper[4546]: I0201 06:45:32.022357 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8tdck" event={"ID":"1ca3c024-0f0b-4651-8eb7-9a7e0511739c","Type":"ContainerStarted","Data":"6f05cf283a19013f2c72b52abd1d2acea30f870a476d4bee5768d9f5cd261911"} Feb 01 06:45:32 crc kubenswrapper[4546]: I0201 06:45:32.022413 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8tdck" event={"ID":"1ca3c024-0f0b-4651-8eb7-9a7e0511739c","Type":"ContainerStarted","Data":"7c27b299779232278890c948c3ea4b7820f582fb3eb173167ea51eaffd8c8283"} Feb 01 06:45:32 crc kubenswrapper[4546]: I0201 06:45:32.038075 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8tdck" 
podStartSLOduration=144.03805199 podStartE2EDuration="2m24.03805199s" podCreationTimestamp="2026-02-01 06:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:45:32.034315564 +0000 UTC m=+162.685251581" watchObservedRunningTime="2026-02-01 06:45:32.03805199 +0000 UTC m=+162.688987996" Feb 01 06:45:32 crc kubenswrapper[4546]: I0201 06:45:32.058007 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:45:32 crc kubenswrapper[4546]: I0201 06:45:32.058772 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:45:32 crc kubenswrapper[4546]: I0201 06:45:32.076471 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:45:32 crc kubenswrapper[4546]: I0201 06:45:32.119503 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:45:33 crc kubenswrapper[4546]: I0201 06:45:33.061836 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:45:33 crc kubenswrapper[4546]: I0201 06:45:33.975784 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gbmsq"] Feb 01 06:45:34 crc kubenswrapper[4546]: I0201 06:45:34.998906 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:45:34 crc kubenswrapper[4546]: I0201 06:45:34.998965 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.034844 4546 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.040155 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gbmsq" podUID="44bb6d15-c261-475e-9978-1d1495b630eb" containerName="registry-server" containerID="cri-o://dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87" gracePeriod=2 Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.095713 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.417187 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.417236 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.454367 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.488513 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.587884 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n47v4\" (UniqueName: \"kubernetes.io/projected/44bb6d15-c261-475e-9978-1d1495b630eb-kube-api-access-n47v4\") pod \"44bb6d15-c261-475e-9978-1d1495b630eb\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.588365 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-utilities\") pod \"44bb6d15-c261-475e-9978-1d1495b630eb\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.588702 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-catalog-content\") pod \"44bb6d15-c261-475e-9978-1d1495b630eb\" (UID: \"44bb6d15-c261-475e-9978-1d1495b630eb\") " Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.589056 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-utilities" (OuterVolumeSpecName: "utilities") pod "44bb6d15-c261-475e-9978-1d1495b630eb" (UID: "44bb6d15-c261-475e-9978-1d1495b630eb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.589408 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.595288 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44bb6d15-c261-475e-9978-1d1495b630eb-kube-api-access-n47v4" (OuterVolumeSpecName: "kube-api-access-n47v4") pod "44bb6d15-c261-475e-9978-1d1495b630eb" (UID: "44bb6d15-c261-475e-9978-1d1495b630eb"). InnerVolumeSpecName "kube-api-access-n47v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.636212 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44bb6d15-c261-475e-9978-1d1495b630eb" (UID: "44bb6d15-c261-475e-9978-1d1495b630eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.691130 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bb6d15-c261-475e-9978-1d1495b630eb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:35 crc kubenswrapper[4546]: I0201 06:45:35.691157 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n47v4\" (UniqueName: \"kubernetes.io/projected/44bb6d15-c261-475e-9978-1d1495b630eb-kube-api-access-n47v4\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.051307 4546 generic.go:334] "Generic (PLEG): container finished" podID="44bb6d15-c261-475e-9978-1d1495b630eb" containerID="dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87" exitCode=0 Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.052095 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbmsq" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.052605 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbmsq" event={"ID":"44bb6d15-c261-475e-9978-1d1495b630eb","Type":"ContainerDied","Data":"dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87"} Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.052657 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbmsq" event={"ID":"44bb6d15-c261-475e-9978-1d1495b630eb","Type":"ContainerDied","Data":"bc6d850a0a54e5178320210026d94ad8b4d458dc13e83224c7b8ae10ce247407"} Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.052685 4546 scope.go:117] "RemoveContainer" containerID="dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.083797 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-gbmsq"] Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.084676 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gbmsq"] Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.088969 4546 scope.go:117] "RemoveContainer" containerID="7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.095127 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.104114 4546 scope.go:117] "RemoveContainer" containerID="f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.119638 4546 scope.go:117] "RemoveContainer" containerID="dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87" Feb 01 06:45:36 crc kubenswrapper[4546]: E0201 06:45:36.120083 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87\": container with ID starting with dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87 not found: ID does not exist" containerID="dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.120118 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87"} err="failed to get container status \"dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87\": rpc error: code = NotFound desc = could not find container \"dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87\": container with ID starting with dd6c57f478c0ae5b5dccb413bfbe0ec36d7149451b7fb684908e922915cd0a87 not found: ID 
does not exist" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.120162 4546 scope.go:117] "RemoveContainer" containerID="7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96" Feb 01 06:45:36 crc kubenswrapper[4546]: E0201 06:45:36.120502 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96\": container with ID starting with 7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96 not found: ID does not exist" containerID="7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.120532 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96"} err="failed to get container status \"7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96\": rpc error: code = NotFound desc = could not find container \"7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96\": container with ID starting with 7320addd039d5f6ce6fc7236d5be8169fbb559f2b518d066d200a2001380ea96 not found: ID does not exist" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.120552 4546 scope.go:117] "RemoveContainer" containerID="f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93" Feb 01 06:45:36 crc kubenswrapper[4546]: E0201 06:45:36.120921 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93\": container with ID starting with f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93 not found: ID does not exist" containerID="f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93" Feb 01 06:45:36 crc kubenswrapper[4546]: I0201 06:45:36.120946 4546 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93"} err="failed to get container status \"f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93\": rpc error: code = NotFound desc = could not find container \"f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93\": container with ID starting with f769a075f04ef4417944662b7139f4e528ca55b7f3228c1de89880b07c410d93 not found: ID does not exist" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.172296 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qqn2r"] Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.485535 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 01 06:45:37 crc kubenswrapper[4546]: E0201 06:45:37.485764 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd9b242-1e4e-46f9-b8fb-04175b46cf9a" containerName="collect-profiles" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.485781 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd9b242-1e4e-46f9-b8fb-04175b46cf9a" containerName="collect-profiles" Feb 01 06:45:37 crc kubenswrapper[4546]: E0201 06:45:37.485795 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bb6d15-c261-475e-9978-1d1495b630eb" containerName="extract-utilities" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.485803 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bb6d15-c261-475e-9978-1d1495b630eb" containerName="extract-utilities" Feb 01 06:45:37 crc kubenswrapper[4546]: E0201 06:45:37.485811 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bb6d15-c261-475e-9978-1d1495b630eb" containerName="extract-content" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.485817 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bb6d15-c261-475e-9978-1d1495b630eb" 
containerName="extract-content" Feb 01 06:45:37 crc kubenswrapper[4546]: E0201 06:45:37.485824 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bb6d15-c261-475e-9978-1d1495b630eb" containerName="registry-server" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.485830 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bb6d15-c261-475e-9978-1d1495b630eb" containerName="registry-server" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.485947 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd9b242-1e4e-46f9-b8fb-04175b46cf9a" containerName="collect-profiles" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.485958 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="44bb6d15-c261-475e-9978-1d1495b630eb" containerName="registry-server" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.486354 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.488795 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.489209 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.503256 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.513059 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:37 crc 
kubenswrapper[4546]: I0201 06:45:37.513396 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.614172 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.614286 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.614572 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.632509 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.663226 4546 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44bb6d15-c261-475e-9978-1d1495b630eb" path="/var/lib/kubelet/pods/44bb6d15-c261-475e-9978-1d1495b630eb/volumes" Feb 01 06:45:37 crc kubenswrapper[4546]: I0201 06:45:37.796762 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.066637 4546 generic.go:334] "Generic (PLEG): container finished" podID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerID="65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3" exitCode=0 Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.066719 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzplm" event={"ID":"6d411dc4-ef2d-4e39-9111-e2ae62f83b37","Type":"ContainerDied","Data":"65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3"} Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.067170 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qqn2r" podUID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerName="registry-server" containerID="cri-o://fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752" gracePeriod=2 Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.170793 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 01 06:45:38 crc kubenswrapper[4546]: W0201 06:45:38.187770 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1971b9b0_7683_44af_b491_ab8e06c2dd8c.slice/crio-9f42ae638841cd1422c5d6e218b71c78c069197f22617603adeb5ac126899231 WatchSource:0}: Error finding container 9f42ae638841cd1422c5d6e218b71c78c069197f22617603adeb5ac126899231: Status 404 returned error can't find the container with id 9f42ae638841cd1422c5d6e218b71c78c069197f22617603adeb5ac126899231 Feb 
01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.469675 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.627805 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-catalog-content\") pod \"91bca231-e9f9-42b7-aa32-db383a098b5b\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.627946 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftchc\" (UniqueName: \"kubernetes.io/projected/91bca231-e9f9-42b7-aa32-db383a098b5b-kube-api-access-ftchc\") pod \"91bca231-e9f9-42b7-aa32-db383a098b5b\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.627996 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-utilities\") pod \"91bca231-e9f9-42b7-aa32-db383a098b5b\" (UID: \"91bca231-e9f9-42b7-aa32-db383a098b5b\") " Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.628672 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-utilities" (OuterVolumeSpecName: "utilities") pod "91bca231-e9f9-42b7-aa32-db383a098b5b" (UID: "91bca231-e9f9-42b7-aa32-db383a098b5b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.638122 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91bca231-e9f9-42b7-aa32-db383a098b5b-kube-api-access-ftchc" (OuterVolumeSpecName: "kube-api-access-ftchc") pod "91bca231-e9f9-42b7-aa32-db383a098b5b" (UID: "91bca231-e9f9-42b7-aa32-db383a098b5b"). InnerVolumeSpecName "kube-api-access-ftchc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.729394 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91bca231-e9f9-42b7-aa32-db383a098b5b" (UID: "91bca231-e9f9-42b7-aa32-db383a098b5b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.730514 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftchc\" (UniqueName: \"kubernetes.io/projected/91bca231-e9f9-42b7-aa32-db383a098b5b-kube-api-access-ftchc\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.730549 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:38 crc kubenswrapper[4546]: I0201 06:45:38.730560 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91bca231-e9f9-42b7-aa32-db383a098b5b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.077239 4546 generic.go:334] "Generic (PLEG): container finished" podID="91bca231-e9f9-42b7-aa32-db383a098b5b" 
containerID="fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752" exitCode=0 Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.077367 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqn2r" event={"ID":"91bca231-e9f9-42b7-aa32-db383a098b5b","Type":"ContainerDied","Data":"fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752"} Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.077472 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqn2r" event={"ID":"91bca231-e9f9-42b7-aa32-db383a098b5b","Type":"ContainerDied","Data":"38aee8309b676e82a8e79d50d6d0ba7fa35e07ba815ed877aa5bf89cdcba44ba"} Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.077524 4546 scope.go:117] "RemoveContainer" containerID="fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.077715 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qqn2r" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.080801 4546 generic.go:334] "Generic (PLEG): container finished" podID="1971b9b0-7683-44af-b491-ab8e06c2dd8c" containerID="fbd0721eabd30727c15b4c3d35639c54b7064cdcba95d0539bb8903960bffa30" exitCode=0 Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.081011 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1971b9b0-7683-44af-b491-ab8e06c2dd8c","Type":"ContainerDied","Data":"fbd0721eabd30727c15b4c3d35639c54b7064cdcba95d0539bb8903960bffa30"} Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.081040 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1971b9b0-7683-44af-b491-ab8e06c2dd8c","Type":"ContainerStarted","Data":"9f42ae638841cd1422c5d6e218b71c78c069197f22617603adeb5ac126899231"} Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.082948 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzplm" event={"ID":"6d411dc4-ef2d-4e39-9111-e2ae62f83b37","Type":"ContainerStarted","Data":"4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139"} Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.087836 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerID="4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3" exitCode=0 Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.087937 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vr449" event={"ID":"a4096fe8-44f5-466f-9d1c-9d32a9f7396e","Type":"ContainerDied","Data":"4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3"} Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.099414 4546 scope.go:117] "RemoveContainer" 
containerID="139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.099712 4546 generic.go:334] "Generic (PLEG): container finished" podID="47612608-8394-4713-b59a-172469b14bbc" containerID="af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0" exitCode=0 Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.099745 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkgtj" event={"ID":"47612608-8394-4713-b59a-172469b14bbc","Type":"ContainerDied","Data":"af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0"} Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.114070 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qqn2r"] Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.116754 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qqn2r"] Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.124362 4546 scope.go:117] "RemoveContainer" containerID="6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.135917 4546 scope.go:117] "RemoveContainer" containerID="fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752" Feb 01 06:45:39 crc kubenswrapper[4546]: E0201 06:45:39.136260 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752\": container with ID starting with fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752 not found: ID does not exist" containerID="fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.136297 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752"} err="failed to get container status \"fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752\": rpc error: code = NotFound desc = could not find container \"fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752\": container with ID starting with fd287abcf1674bdeae8bc4ad804f05bcd424757efb680dc63f6c5f71ddb28752 not found: ID does not exist" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.136320 4546 scope.go:117] "RemoveContainer" containerID="139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790" Feb 01 06:45:39 crc kubenswrapper[4546]: E0201 06:45:39.136610 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790\": container with ID starting with 139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790 not found: ID does not exist" containerID="139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.136631 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790"} err="failed to get container status \"139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790\": rpc error: code = NotFound desc = could not find container \"139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790\": container with ID starting with 139bdc9df10ec4b441cc65d4c750ad8a2e04f6dc4be06e47fecb39517f950790 not found: ID does not exist" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.136642 4546 scope.go:117] "RemoveContainer" containerID="6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da" Feb 01 06:45:39 crc kubenswrapper[4546]: E0201 06:45:39.136961 4546 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da\": container with ID starting with 6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da not found: ID does not exist" containerID="6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.136980 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da"} err="failed to get container status \"6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da\": rpc error: code = NotFound desc = could not find container \"6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da\": container with ID starting with 6806c31d3af6640083656f27c9712d231c5e66475f0920951e69a366d50c73da not found: ID does not exist" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.140541 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qzplm" podStartSLOduration=3.826839133 podStartE2EDuration="48.140531078s" podCreationTimestamp="2026-02-01 06:44:51 +0000 UTC" firstStartedPulling="2026-02-01 06:44:54.23341696 +0000 UTC m=+124.884352966" lastFinishedPulling="2026-02-01 06:45:38.547108895 +0000 UTC m=+169.198044911" observedRunningTime="2026-02-01 06:45:39.139368218 +0000 UTC m=+169.790304234" watchObservedRunningTime="2026-02-01 06:45:39.140531078 +0000 UTC m=+169.791467094" Feb 01 06:45:39 crc kubenswrapper[4546]: I0201 06:45:39.661198 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91bca231-e9f9-42b7-aa32-db383a098b5b" path="/var/lib/kubelet/pods/91bca231-e9f9-42b7-aa32-db383a098b5b/volumes" Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.108838 4546 generic.go:334] "Generic (PLEG): container finished" podID="6378c03c-77b0-4d0d-8dd3-2b789468177a" 
containerID="e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b" exitCode=0 Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.109298 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w4m8" event={"ID":"6378c03c-77b0-4d0d-8dd3-2b789468177a","Type":"ContainerDied","Data":"e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b"} Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.113196 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vr449" event={"ID":"a4096fe8-44f5-466f-9d1c-9d32a9f7396e","Type":"ContainerStarted","Data":"68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1"} Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.115866 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkgtj" event={"ID":"47612608-8394-4713-b59a-172469b14bbc","Type":"ContainerStarted","Data":"4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db"} Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.156741 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fkgtj" podStartSLOduration=2.6756579560000002 podStartE2EDuration="49.156722912s" podCreationTimestamp="2026-02-01 06:44:51 +0000 UTC" firstStartedPulling="2026-02-01 06:44:53.188300087 +0000 UTC m=+123.839236103" lastFinishedPulling="2026-02-01 06:45:39.669365043 +0000 UTC m=+170.320301059" observedRunningTime="2026-02-01 06:45:40.155890484 +0000 UTC m=+170.806826501" watchObservedRunningTime="2026-02-01 06:45:40.156722912 +0000 UTC m=+170.807658927" Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.171366 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vr449" podStartSLOduration=1.851439939 podStartE2EDuration="47.171343935s" podCreationTimestamp="2026-02-01 06:44:53 
+0000 UTC" firstStartedPulling="2026-02-01 06:44:54.242384628 +0000 UTC m=+124.893320644" lastFinishedPulling="2026-02-01 06:45:39.562288623 +0000 UTC m=+170.213224640" observedRunningTime="2026-02-01 06:45:40.168116858 +0000 UTC m=+170.819052874" watchObservedRunningTime="2026-02-01 06:45:40.171343935 +0000 UTC m=+170.822279941" Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.403691 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.553785 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kube-api-access\") pod \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\" (UID: \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\") " Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.553885 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kubelet-dir\") pod \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\" (UID: \"1971b9b0-7683-44af-b491-ab8e06c2dd8c\") " Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.554179 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1971b9b0-7683-44af-b491-ab8e06c2dd8c" (UID: "1971b9b0-7683-44af-b491-ab8e06c2dd8c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.561157 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1971b9b0-7683-44af-b491-ab8e06c2dd8c" (UID: "1971b9b0-7683-44af-b491-ab8e06c2dd8c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.655549 4546 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:40 crc kubenswrapper[4546]: I0201 06:45:40.655589 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1971b9b0-7683-44af-b491-ab8e06c2dd8c-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:41 crc kubenswrapper[4546]: I0201 06:45:41.125062 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1971b9b0-7683-44af-b491-ab8e06c2dd8c","Type":"ContainerDied","Data":"9f42ae638841cd1422c5d6e218b71c78c069197f22617603adeb5ac126899231"} Feb 01 06:45:41 crc kubenswrapper[4546]: I0201 06:45:41.125120 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f42ae638841cd1422c5d6e218b71c78c069197f22617603adeb5ac126899231" Feb 01 06:45:41 crc kubenswrapper[4546]: I0201 06:45:41.125153 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 06:45:41 crc kubenswrapper[4546]: I0201 06:45:41.126802 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w4m8" event={"ID":"6378c03c-77b0-4d0d-8dd3-2b789468177a","Type":"ContainerStarted","Data":"d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc"} Feb 01 06:45:41 crc kubenswrapper[4546]: I0201 06:45:41.836833 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:45:41 crc kubenswrapper[4546]: I0201 06:45:41.837173 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:45:41 crc kubenswrapper[4546]: I0201 06:45:41.869740 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:45:41 crc kubenswrapper[4546]: I0201 06:45:41.886086 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9w4m8" podStartSLOduration=3.560323894 podStartE2EDuration="48.886068298s" podCreationTimestamp="2026-02-01 06:44:53 +0000 UTC" firstStartedPulling="2026-02-01 06:44:55.362659794 +0000 UTC m=+126.013595810" lastFinishedPulling="2026-02-01 06:45:40.688404198 +0000 UTC m=+171.339340214" observedRunningTime="2026-02-01 06:45:41.144200072 +0000 UTC m=+171.795136088" watchObservedRunningTime="2026-02-01 06:45:41.886068298 +0000 UTC m=+172.537004313" Feb 01 06:45:42 crc kubenswrapper[4546]: I0201 06:45:42.282891 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:45:42 crc kubenswrapper[4546]: I0201 06:45:42.282948 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:45:42 
crc kubenswrapper[4546]: I0201 06:45:42.316242 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:45:43 crc kubenswrapper[4546]: I0201 06:45:43.174963 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:45:43 crc kubenswrapper[4546]: I0201 06:45:43.617122 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:45:43 crc kubenswrapper[4546]: I0201 06:45:43.617270 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:45:43 crc kubenswrapper[4546]: I0201 06:45:43.649399 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.043346 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.043403 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.076797 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.087420 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 01 06:45:44 crc kubenswrapper[4546]: E0201 06:45:44.087644 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerName="extract-content" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.087662 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerName="extract-content" Feb 01 06:45:44 crc kubenswrapper[4546]: E0201 06:45:44.087676 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerName="registry-server" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.087684 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerName="registry-server" Feb 01 06:45:44 crc kubenswrapper[4546]: E0201 06:45:44.087694 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1971b9b0-7683-44af-b491-ab8e06c2dd8c" containerName="pruner" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.087699 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1971b9b0-7683-44af-b491-ab8e06c2dd8c" containerName="pruner" Feb 01 06:45:44 crc kubenswrapper[4546]: E0201 06:45:44.087709 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerName="extract-utilities" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.087716 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerName="extract-utilities" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.087809 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="1971b9b0-7683-44af-b491-ab8e06c2dd8c" containerName="pruner" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.087820 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="91bca231-e9f9-42b7-aa32-db383a098b5b" containerName="registry-server" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.088390 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.090046 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.090430 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.100360 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.101803 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.101836 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-var-lock\") pod \"installer-9-crc\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.101868 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kube-api-access\") pod \"installer-9-crc\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.168620 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:45:44 crc 
kubenswrapper[4546]: I0201 06:45:44.204771 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.204849 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-var-lock\") pod \"installer-9-crc\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.204897 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kube-api-access\") pod \"installer-9-crc\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.204928 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.204945 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-var-lock\") pod \"installer-9-crc\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.243769 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kube-api-access\") pod \"installer-9-crc\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.401174 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:45:44 crc kubenswrapper[4546]: I0201 06:45:44.775605 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 01 06:45:44 crc kubenswrapper[4546]: W0201 06:45:44.779885 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb169bb8d_05fd_433a_ab97_3433c3cb42d3.slice/crio-f92829cfd65e1f49c9b8ff24d7f5e690d17de5b8614ac34718c00e999c970c58 WatchSource:0}: Error finding container f92829cfd65e1f49c9b8ff24d7f5e690d17de5b8614ac34718c00e999c970c58: Status 404 returned error can't find the container with id f92829cfd65e1f49c9b8ff24d7f5e690d17de5b8614ac34718c00e999c970c58 Feb 01 06:45:45 crc kubenswrapper[4546]: I0201 06:45:45.148505 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b169bb8d-05fd-433a-ab97-3433c3cb42d3","Type":"ContainerStarted","Data":"d177d8be60be8b9a2382c4ddc95879b7557bdb7c42e14862cce71ff401b15c0b"} Feb 01 06:45:45 crc kubenswrapper[4546]: I0201 06:45:45.148787 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b169bb8d-05fd-433a-ab97-3433c3cb42d3","Type":"ContainerStarted","Data":"f92829cfd65e1f49c9b8ff24d7f5e690d17de5b8614ac34718c00e999c970c58"} Feb 01 06:45:45 crc kubenswrapper[4546]: I0201 06:45:45.163128 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.163115511 podStartE2EDuration="1.163115511s" podCreationTimestamp="2026-02-01 06:45:44 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:45:45.162334983 +0000 UTC m=+175.813270998" watchObservedRunningTime="2026-02-01 06:45:45.163115511 +0000 UTC m=+175.814051527" Feb 01 06:45:47 crc kubenswrapper[4546]: I0201 06:45:47.556846 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzplm"] Feb 01 06:45:47 crc kubenswrapper[4546]: I0201 06:45:47.557340 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qzplm" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerName="registry-server" containerID="cri-o://4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139" gracePeriod=2 Feb 01 06:45:47 crc kubenswrapper[4546]: I0201 06:45:47.896965 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.058247 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-catalog-content\") pod \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.058336 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-utilities\") pod \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.058372 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8npnn\" (UniqueName: \"kubernetes.io/projected/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-kube-api-access-8npnn\") pod 
\"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\" (UID: \"6d411dc4-ef2d-4e39-9111-e2ae62f83b37\") " Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.059804 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-utilities" (OuterVolumeSpecName: "utilities") pod "6d411dc4-ef2d-4e39-9111-e2ae62f83b37" (UID: "6d411dc4-ef2d-4e39-9111-e2ae62f83b37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.064360 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-kube-api-access-8npnn" (OuterVolumeSpecName: "kube-api-access-8npnn") pod "6d411dc4-ef2d-4e39-9111-e2ae62f83b37" (UID: "6d411dc4-ef2d-4e39-9111-e2ae62f83b37"). InnerVolumeSpecName "kube-api-access-8npnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.102489 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d411dc4-ef2d-4e39-9111-e2ae62f83b37" (UID: "6d411dc4-ef2d-4e39-9111-e2ae62f83b37"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.159174 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.159322 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.159403 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8npnn\" (UniqueName: \"kubernetes.io/projected/6d411dc4-ef2d-4e39-9111-e2ae62f83b37-kube-api-access-8npnn\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.167540 4546 generic.go:334] "Generic (PLEG): container finished" podID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerID="4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139" exitCode=0 Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.167658 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzplm" event={"ID":"6d411dc4-ef2d-4e39-9111-e2ae62f83b37","Type":"ContainerDied","Data":"4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139"} Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.167744 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzplm" event={"ID":"6d411dc4-ef2d-4e39-9111-e2ae62f83b37","Type":"ContainerDied","Data":"f3487d969061c2fa99d277b1c0e05a1dbede2e7d98c0eb8a0bc9cb9e845bd247"} Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.167815 4546 scope.go:117] "RemoveContainer" containerID="4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 
06:45:48.167991 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzplm" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.187005 4546 scope.go:117] "RemoveContainer" containerID="65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.192865 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzplm"] Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.196154 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qzplm"] Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.212742 4546 scope.go:117] "RemoveContainer" containerID="61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.233980 4546 scope.go:117] "RemoveContainer" containerID="4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139" Feb 01 06:45:48 crc kubenswrapper[4546]: E0201 06:45:48.234401 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139\": container with ID starting with 4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139 not found: ID does not exist" containerID="4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.234434 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139"} err="failed to get container status \"4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139\": rpc error: code = NotFound desc = could not find container \"4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139\": container with ID starting with 
4076e0b0e4bb0aba4c58686cdc042efce037e7f35a9b9cd93b9082493e3d7139 not found: ID does not exist" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.234459 4546 scope.go:117] "RemoveContainer" containerID="65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3" Feb 01 06:45:48 crc kubenswrapper[4546]: E0201 06:45:48.234875 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3\": container with ID starting with 65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3 not found: ID does not exist" containerID="65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.234911 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3"} err="failed to get container status \"65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3\": rpc error: code = NotFound desc = could not find container \"65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3\": container with ID starting with 65a86eef12b5cdc1182ca70d39a70ac135496398404779ba7ba410a273dc48b3 not found: ID does not exist" Feb 01 06:45:48 crc kubenswrapper[4546]: I0201 06:45:48.234937 4546 scope.go:117] "RemoveContainer" containerID="61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648" Feb 01 06:45:48 crc kubenswrapper[4546]: E0201 06:45:48.235418 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648\": container with ID starting with 61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648 not found: ID does not exist" containerID="61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648" Feb 01 06:45:48 crc 
kubenswrapper[4546]: I0201 06:45:48.235439 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648"} err="failed to get container status \"61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648\": rpc error: code = NotFound desc = could not find container \"61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648\": container with ID starting with 61cf22af29bc55efebbbdfb55d0d64313d417dcd94f99f612c00159e2e378648 not found: ID does not exist" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.288887 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" podUID="a6d2f6da-ac32-41d4-b1bb-ed5c96364254" containerName="oauth-openshift" containerID="cri-o://914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789" gracePeriod=15 Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.611041 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.675034 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" path="/var/lib/kubelet/pods/6d411dc4-ef2d-4e39-9111-e2ae62f83b37/volumes" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684229 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-dir\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684278 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64rjr\" (UniqueName: \"kubernetes.io/projected/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-kube-api-access-64rjr\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684314 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-trusted-ca-bundle\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684341 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-policies\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684336 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684393 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-login\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684433 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-error\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684448 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-ocp-branding-template\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684479 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-service-ca\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684508 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-session\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684522 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-serving-cert\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684578 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-cliconfig\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684596 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-router-certs\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684615 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-provider-selection\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684647 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-idp-0-file-data\") pod \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\" (UID: \"a6d2f6da-ac32-41d4-b1bb-ed5c96364254\") " Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.684939 4546 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.685346 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.689453 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.690823 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.691193 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.692813 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.693462 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.695609 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.695986 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-kube-api-access-64rjr" (OuterVolumeSpecName: "kube-api-access-64rjr") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "kube-api-access-64rjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.696515 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.696749 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.697078 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.697121 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.707879 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a6d2f6da-ac32-41d4-b1bb-ed5c96364254" (UID: "a6d2f6da-ac32-41d4-b1bb-ed5c96364254"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785848 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64rjr\" (UniqueName: \"kubernetes.io/projected/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-kube-api-access-64rjr\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785888 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785899 4546 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785909 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785919 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785948 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785957 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785966 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785974 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785983 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.785993 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.786003 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:49 crc kubenswrapper[4546]: I0201 06:45:49.786011 4546 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a6d2f6da-ac32-41d4-b1bb-ed5c96364254-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 
06:45:50.192338 4546 generic.go:334] "Generic (PLEG): container finished" podID="a6d2f6da-ac32-41d4-b1bb-ed5c96364254" containerID="914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789" exitCode=0 Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.192400 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" event={"ID":"a6d2f6da-ac32-41d4-b1bb-ed5c96364254","Type":"ContainerDied","Data":"914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789"} Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.192431 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" event={"ID":"a6d2f6da-ac32-41d4-b1bb-ed5c96364254","Type":"ContainerDied","Data":"af7ae06f4bb919b6b28353e9de5eacc33251c26b5995870d66adf7afd96fa441"} Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.192447 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9n59f" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.192455 4546 scope.go:117] "RemoveContainer" containerID="914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.216124 4546 scope.go:117] "RemoveContainer" containerID="914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789" Feb 01 06:45:50 crc kubenswrapper[4546]: E0201 06:45:50.216820 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789\": container with ID starting with 914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789 not found: ID does not exist" containerID="914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.216990 4546 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789"} err="failed to get container status \"914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789\": rpc error: code = NotFound desc = could not find container \"914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789\": container with ID starting with 914d95b5deb1cf9a5ffdd07b12d1a1537e86407545a4834f2ae74d0b485bf789 not found: ID does not exist" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.221879 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9n59f"] Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.227891 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9n59f"] Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.611261 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-94z5v"] Feb 01 06:45:50 crc kubenswrapper[4546]: E0201 06:45:50.611949 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerName="extract-content" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.611973 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerName="extract-content" Feb 01 06:45:50 crc kubenswrapper[4546]: E0201 06:45:50.611990 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d2f6da-ac32-41d4-b1bb-ed5c96364254" containerName="oauth-openshift" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.611997 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d2f6da-ac32-41d4-b1bb-ed5c96364254" containerName="oauth-openshift" Feb 01 06:45:50 crc kubenswrapper[4546]: E0201 06:45:50.612009 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerName="extract-utilities" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.612018 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerName="extract-utilities" Feb 01 06:45:50 crc kubenswrapper[4546]: E0201 06:45:50.612028 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerName="registry-server" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.612034 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerName="registry-server" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.612164 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d411dc4-ef2d-4e39-9111-e2ae62f83b37" containerName="registry-server" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.612179 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d2f6da-ac32-41d4-b1bb-ed5c96364254" containerName="oauth-openshift" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.612716 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.620871 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.621413 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.621416 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.621587 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.621646 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.621416 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.621502 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.622079 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.622358 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.623345 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 01 06:45:50 crc 
kubenswrapper[4546]: I0201 06:45:50.624294 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.624533 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.625042 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.631923 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-94z5v"] Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.636361 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.651940 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.797549 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.797617 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: 
\"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.797649 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.797670 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh9ff\" (UniqueName: \"kubernetes.io/projected/6d85c342-0039-4a60-a1cc-05a839fbe0e4-kube-api-access-hh9ff\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.797698 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.797947 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: 
I0201 06:45:50.798042 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.798114 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.798148 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.798178 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.798206 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/6d85c342-0039-4a60-a1cc-05a839fbe0e4-audit-dir\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.798232 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.798420 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.798496 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-audit-policies\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.899910 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") 
" pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.899967 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.899993 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900012 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh9ff\" (UniqueName: \"kubernetes.io/projected/6d85c342-0039-4a60-a1cc-05a839fbe0e4-kube-api-access-hh9ff\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900033 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900070 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900096 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900119 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900136 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900154 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: 
\"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900176 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900195 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6d85c342-0039-4a60-a1cc-05a839fbe0e4-audit-dir\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900236 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900253 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-audit-policies\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.900649 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/6d85c342-0039-4a60-a1cc-05a839fbe0e4-audit-dir\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.901026 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-audit-policies\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.901477 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.901907 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.906235 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.906255 
4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.906587 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.906769 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.907391 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.909297 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.909954 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.911241 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.912609 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6d85c342-0039-4a60-a1cc-05a839fbe0e4-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 crc kubenswrapper[4546]: I0201 06:45:50.920027 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh9ff\" (UniqueName: \"kubernetes.io/projected/6d85c342-0039-4a60-a1cc-05a839fbe0e4-kube-api-access-hh9ff\") pod \"oauth-openshift-6b9699fff8-94z5v\" (UID: \"6d85c342-0039-4a60-a1cc-05a839fbe0e4\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:50 
crc kubenswrapper[4546]: I0201 06:45:50.934384 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:51 crc kubenswrapper[4546]: I0201 06:45:51.298335 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-94z5v"] Feb 01 06:45:51 crc kubenswrapper[4546]: W0201 06:45:51.306125 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d85c342_0039_4a60_a1cc_05a839fbe0e4.slice/crio-6d203111f31a19937b95a710492ecfb53869612777321187b7f08d9d24f46233 WatchSource:0}: Error finding container 6d203111f31a19937b95a710492ecfb53869612777321187b7f08d9d24f46233: Status 404 returned error can't find the container with id 6d203111f31a19937b95a710492ecfb53869612777321187b7f08d9d24f46233 Feb 01 06:45:51 crc kubenswrapper[4546]: I0201 06:45:51.662101 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d2f6da-ac32-41d4-b1bb-ed5c96364254" path="/var/lib/kubelet/pods/a6d2f6da-ac32-41d4-b1bb-ed5c96364254/volumes" Feb 01 06:45:51 crc kubenswrapper[4546]: I0201 06:45:51.879351 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:45:52 crc kubenswrapper[4546]: I0201 06:45:52.207305 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" event={"ID":"6d85c342-0039-4a60-a1cc-05a839fbe0e4","Type":"ContainerStarted","Data":"43acff97979dab7835261d9230f5fea832035254461f4948e4d24c586a0f0970"} Feb 01 06:45:52 crc kubenswrapper[4546]: I0201 06:45:52.207373 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" 
event={"ID":"6d85c342-0039-4a60-a1cc-05a839fbe0e4","Type":"ContainerStarted","Data":"6d203111f31a19937b95a710492ecfb53869612777321187b7f08d9d24f46233"} Feb 01 06:45:52 crc kubenswrapper[4546]: I0201 06:45:52.207554 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:52 crc kubenswrapper[4546]: I0201 06:45:52.213617 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" Feb 01 06:45:52 crc kubenswrapper[4546]: I0201 06:45:52.232052 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b9699fff8-94z5v" podStartSLOduration=28.232022019 podStartE2EDuration="28.232022019s" podCreationTimestamp="2026-02-01 06:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:45:52.228078943 +0000 UTC m=+182.879014959" watchObservedRunningTime="2026-02-01 06:45:52.232022019 +0000 UTC m=+182.882958025" Feb 01 06:45:54 crc kubenswrapper[4546]: I0201 06:45:54.076767 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:45:55 crc kubenswrapper[4546]: I0201 06:45:55.420774 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:45:55 crc kubenswrapper[4546]: I0201 06:45:55.420847 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:45:55 crc kubenswrapper[4546]: I0201 06:45:55.783109 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:45:57 crc kubenswrapper[4546]: I0201 06:45:57.558797 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9w4m8"] Feb 01 06:45:57 crc kubenswrapper[4546]: I0201 06:45:57.559341 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9w4m8" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerName="registry-server" containerID="cri-o://d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc" gracePeriod=2 Feb 01 06:45:57 crc kubenswrapper[4546]: I0201 06:45:57.902158 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:45:57 crc kubenswrapper[4546]: I0201 06:45:57.998270 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-catalog-content\") pod \"6378c03c-77b0-4d0d-8dd3-2b789468177a\" (UID: \"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " Feb 01 06:45:57 crc kubenswrapper[4546]: I0201 06:45:57.998386 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wv4r\" (UniqueName: \"kubernetes.io/projected/6378c03c-77b0-4d0d-8dd3-2b789468177a-kube-api-access-2wv4r\") pod \"6378c03c-77b0-4d0d-8dd3-2b789468177a\" (UID: \"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " Feb 01 06:45:57 crc kubenswrapper[4546]: I0201 06:45:57.998534 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-utilities\") pod \"6378c03c-77b0-4d0d-8dd3-2b789468177a\" (UID: \"6378c03c-77b0-4d0d-8dd3-2b789468177a\") " Feb 01 06:45:57 crc kubenswrapper[4546]: I0201 06:45:57.999569 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-utilities" (OuterVolumeSpecName: "utilities") pod "6378c03c-77b0-4d0d-8dd3-2b789468177a" (UID: "6378c03c-77b0-4d0d-8dd3-2b789468177a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.005556 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6378c03c-77b0-4d0d-8dd3-2b789468177a-kube-api-access-2wv4r" (OuterVolumeSpecName: "kube-api-access-2wv4r") pod "6378c03c-77b0-4d0d-8dd3-2b789468177a" (UID: "6378c03c-77b0-4d0d-8dd3-2b789468177a"). InnerVolumeSpecName "kube-api-access-2wv4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.018109 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6378c03c-77b0-4d0d-8dd3-2b789468177a" (UID: "6378c03c-77b0-4d0d-8dd3-2b789468177a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.099875 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wv4r\" (UniqueName: \"kubernetes.io/projected/6378c03c-77b0-4d0d-8dd3-2b789468177a-kube-api-access-2wv4r\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.099907 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.099920 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6378c03c-77b0-4d0d-8dd3-2b789468177a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.246090 4546 generic.go:334] "Generic (PLEG): container finished" podID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerID="d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc" exitCode=0 Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.246149 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w4m8" event={"ID":"6378c03c-77b0-4d0d-8dd3-2b789468177a","Type":"ContainerDied","Data":"d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc"} Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.246187 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w4m8" event={"ID":"6378c03c-77b0-4d0d-8dd3-2b789468177a","Type":"ContainerDied","Data":"47986c449b6af641f4ce9c6a566d8ddab2be8d9a6c3c52fdbce0cad60cd976fc"} Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.246196 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9w4m8" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.246208 4546 scope.go:117] "RemoveContainer" containerID="d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.266557 4546 scope.go:117] "RemoveContainer" containerID="e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.284308 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9w4m8"] Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.292143 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9w4m8"] Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.294149 4546 scope.go:117] "RemoveContainer" containerID="a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.306736 4546 scope.go:117] "RemoveContainer" containerID="d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc" Feb 01 06:45:58 crc kubenswrapper[4546]: E0201 06:45:58.307183 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc\": container with ID starting with d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc not found: ID does not exist" containerID="d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.307318 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc"} err="failed to get container status \"d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc\": rpc error: code = NotFound desc = could not find container 
\"d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc\": container with ID starting with d7bd45f6378626f453399fc9b87fc54b863d1ae44d7d58dc6f51e544d63293dc not found: ID does not exist" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.307424 4546 scope.go:117] "RemoveContainer" containerID="e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b" Feb 01 06:45:58 crc kubenswrapper[4546]: E0201 06:45:58.308649 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b\": container with ID starting with e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b not found: ID does not exist" containerID="e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.308751 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b"} err="failed to get container status \"e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b\": rpc error: code = NotFound desc = could not find container \"e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b\": container with ID starting with e816be2923c3c93d39e416fe65a3fb302f6f4eb8fafab637ead47796a23b570b not found: ID does not exist" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.308830 4546 scope.go:117] "RemoveContainer" containerID="a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06" Feb 01 06:45:58 crc kubenswrapper[4546]: E0201 06:45:58.309297 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06\": container with ID starting with a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06 not found: ID does not exist" 
containerID="a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06" Feb 01 06:45:58 crc kubenswrapper[4546]: I0201 06:45:58.309344 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06"} err="failed to get container status \"a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06\": rpc error: code = NotFound desc = could not find container \"a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06\": container with ID starting with a9c4d45189569639021595f220f55782138eac3fea322e222b123a3ee6456b06 not found: ID does not exist" Feb 01 06:45:59 crc kubenswrapper[4546]: I0201 06:45:59.662576 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a" path="/var/lib/kubelet/pods/6378c03c-77b0-4d0d-8dd3-2b789468177a/volumes" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.585210 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fkgtj"] Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.586252 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fkgtj" podUID="47612608-8394-4713-b59a-172469b14bbc" containerName="registry-server" containerID="cri-o://4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db" gracePeriod=30 Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.595363 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kn94x"] Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.595630 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kn94x" podUID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerName="registry-server" containerID="cri-o://a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775" 
gracePeriod=30 Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.611975 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs98t"] Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.612276 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" podUID="ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" containerName="marketplace-operator" containerID="cri-o://777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4" gracePeriod=30 Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.617791 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vr449"] Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.618051 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vr449" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerName="registry-server" containerID="cri-o://68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1" gracePeriod=30 Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.622347 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fknp6"] Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.622538 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fknp6" podUID="318d8499-a380-4204-b4ee-15d2692874e3" containerName="registry-server" containerID="cri-o://7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397" gracePeriod=30 Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.636008 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2tsqn"] Feb 01 06:46:05 crc kubenswrapper[4546]: E0201 06:46:05.636247 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerName="extract-content" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.636267 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerName="extract-content" Feb 01 06:46:05 crc kubenswrapper[4546]: E0201 06:46:05.636283 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerName="registry-server" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.636289 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerName="registry-server" Feb 01 06:46:05 crc kubenswrapper[4546]: E0201 06:46:05.636298 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerName="extract-utilities" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.636305 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerName="extract-utilities" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.636418 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6378c03c-77b0-4d0d-8dd3-2b789468177a" containerName="registry-server" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.636814 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.646205 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2tsqn"] Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.698524 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd2a4a8d-fc02-4822-979d-5c1f30653add-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2tsqn\" (UID: \"dd2a4a8d-fc02-4822-979d-5c1f30653add\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.698682 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dd2a4a8d-fc02-4822-979d-5c1f30653add-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2tsqn\" (UID: \"dd2a4a8d-fc02-4822-979d-5c1f30653add\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.698783 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlxlq\" (UniqueName: \"kubernetes.io/projected/dd2a4a8d-fc02-4822-979d-5c1f30653add-kube-api-access-qlxlq\") pod \"marketplace-operator-79b997595-2tsqn\" (UID: \"dd2a4a8d-fc02-4822-979d-5c1f30653add\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.724712 4546 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bs98t container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Feb 01 
06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.724774 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" podUID="ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.799547 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlxlq\" (UniqueName: \"kubernetes.io/projected/dd2a4a8d-fc02-4822-979d-5c1f30653add-kube-api-access-qlxlq\") pod \"marketplace-operator-79b997595-2tsqn\" (UID: \"dd2a4a8d-fc02-4822-979d-5c1f30653add\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.799624 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd2a4a8d-fc02-4822-979d-5c1f30653add-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2tsqn\" (UID: \"dd2a4a8d-fc02-4822-979d-5c1f30653add\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.799688 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dd2a4a8d-fc02-4822-979d-5c1f30653add-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2tsqn\" (UID: \"dd2a4a8d-fc02-4822-979d-5c1f30653add\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.801232 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd2a4a8d-fc02-4822-979d-5c1f30653add-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-2tsqn\" (UID: \"dd2a4a8d-fc02-4822-979d-5c1f30653add\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.809511 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dd2a4a8d-fc02-4822-979d-5c1f30653add-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2tsqn\" (UID: \"dd2a4a8d-fc02-4822-979d-5c1f30653add\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.823025 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlxlq\" (UniqueName: \"kubernetes.io/projected/dd2a4a8d-fc02-4822-979d-5c1f30653add-kube-api-access-qlxlq\") pod \"marketplace-operator-79b997595-2tsqn\" (UID: \"dd2a4a8d-fc02-4822-979d-5c1f30653add\") " pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.884152 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:05 crc kubenswrapper[4546]: I0201 06:46:05.969734 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.106484 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89lh8\" (UniqueName: \"kubernetes.io/projected/47612608-8394-4713-b59a-172469b14bbc-kube-api-access-89lh8\") pod \"47612608-8394-4713-b59a-172469b14bbc\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.106604 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-catalog-content\") pod \"47612608-8394-4713-b59a-172469b14bbc\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.106734 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-utilities\") pod \"47612608-8394-4713-b59a-172469b14bbc\" (UID: \"47612608-8394-4713-b59a-172469b14bbc\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.108574 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-utilities" (OuterVolumeSpecName: "utilities") pod "47612608-8394-4713-b59a-172469b14bbc" (UID: "47612608-8394-4713-b59a-172469b14bbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.113696 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47612608-8394-4713-b59a-172469b14bbc-kube-api-access-89lh8" (OuterVolumeSpecName: "kube-api-access-89lh8") pod "47612608-8394-4713-b59a-172469b14bbc" (UID: "47612608-8394-4713-b59a-172469b14bbc"). InnerVolumeSpecName "kube-api-access-89lh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.118576 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.130806 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.141132 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.147541 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.193294 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47612608-8394-4713-b59a-172469b14bbc" (UID: "47612608-8394-4713-b59a-172469b14bbc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210219 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-catalog-content\") pod \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210275 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-operator-metrics\") pod \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210315 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-catalog-content\") pod \"318d8499-a380-4204-b4ee-15d2692874e3\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210342 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwwh2\" (UniqueName: \"kubernetes.io/projected/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-kube-api-access-vwwh2\") pod \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210379 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-utilities\") pod \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210411 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-lj7dr\" (UniqueName: \"kubernetes.io/projected/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-kube-api-access-lj7dr\") pod \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210449 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-utilities\") pod \"318d8499-a380-4204-b4ee-15d2692874e3\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210483 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-catalog-content\") pod \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\" (UID: \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210515 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-trusted-ca\") pod \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\" (UID: \"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210547 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdnzz\" (UniqueName: \"kubernetes.io/projected/318d8499-a380-4204-b4ee-15d2692874e3-kube-api-access-bdnzz\") pod \"318d8499-a380-4204-b4ee-15d2692874e3\" (UID: \"318d8499-a380-4204-b4ee-15d2692874e3\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.210595 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxbd5\" (UniqueName: \"kubernetes.io/projected/bca0710d-d2ea-4726-84bb-0bf49d93a63a-kube-api-access-lxbd5\") pod \"bca0710d-d2ea-4726-84bb-0bf49d93a63a\" (UID: 
\"bca0710d-d2ea-4726-84bb-0bf49d93a63a\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.211016 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.211042 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47612608-8394-4713-b59a-172469b14bbc-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.211053 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89lh8\" (UniqueName: \"kubernetes.io/projected/47612608-8394-4713-b59a-172469b14bbc-kube-api-access-89lh8\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.213756 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" (UID: "ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.213809 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-utilities" (OuterVolumeSpecName: "utilities") pod "318d8499-a380-4204-b4ee-15d2692874e3" (UID: "318d8499-a380-4204-b4ee-15d2692874e3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.214235 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-utilities" (OuterVolumeSpecName: "utilities") pod "bca0710d-d2ea-4726-84bb-0bf49d93a63a" (UID: "bca0710d-d2ea-4726-84bb-0bf49d93a63a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.232805 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318d8499-a380-4204-b4ee-15d2692874e3-kube-api-access-bdnzz" (OuterVolumeSpecName: "kube-api-access-bdnzz") pod "318d8499-a380-4204-b4ee-15d2692874e3" (UID: "318d8499-a380-4204-b4ee-15d2692874e3"). InnerVolumeSpecName "kube-api-access-bdnzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.233122 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca0710d-d2ea-4726-84bb-0bf49d93a63a-kube-api-access-lxbd5" (OuterVolumeSpecName: "kube-api-access-lxbd5") pod "bca0710d-d2ea-4726-84bb-0bf49d93a63a" (UID: "bca0710d-d2ea-4726-84bb-0bf49d93a63a"). InnerVolumeSpecName "kube-api-access-lxbd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.233391 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" (UID: "ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.235212 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-kube-api-access-lj7dr" (OuterVolumeSpecName: "kube-api-access-lj7dr") pod "ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" (UID: "ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697"). InnerVolumeSpecName "kube-api-access-lj7dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.236845 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-kube-api-access-vwwh2" (OuterVolumeSpecName: "kube-api-access-vwwh2") pod "a4096fe8-44f5-466f-9d1c-9d32a9f7396e" (UID: "a4096fe8-44f5-466f-9d1c-9d32a9f7396e"). InnerVolumeSpecName "kube-api-access-vwwh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.245048 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4096fe8-44f5-466f-9d1c-9d32a9f7396e" (UID: "a4096fe8-44f5-466f-9d1c-9d32a9f7396e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.267910 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bca0710d-d2ea-4726-84bb-0bf49d93a63a" (UID: "bca0710d-d2ea-4726-84bb-0bf49d93a63a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.301635 4546 generic.go:334] "Generic (PLEG): container finished" podID="ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" containerID="777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4" exitCode=0 Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.301802 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.302967 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" event={"ID":"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697","Type":"ContainerDied","Data":"777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.303060 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bs98t" event={"ID":"ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697","Type":"ContainerDied","Data":"a2d75f6837aa7f596e840c8d8498bfe80d80474548b073422ec7495b18a9c25f"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.303136 4546 scope.go:117] "RemoveContainer" containerID="777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.309188 4546 generic.go:334] "Generic (PLEG): container finished" podID="47612608-8394-4713-b59a-172469b14bbc" containerID="4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db" exitCode=0 Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.309582 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fkgtj" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.314279 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkgtj" event={"ID":"47612608-8394-4713-b59a-172469b14bbc","Type":"ContainerDied","Data":"4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.314333 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkgtj" event={"ID":"47612608-8394-4713-b59a-172469b14bbc","Type":"ContainerDied","Data":"70f82743a4aae04861a733f03b0e5e9b0c44cc4b28ec4366fc4869643723e0d3"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.326901 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-utilities\") pod \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\" (UID: \"a4096fe8-44f5-466f-9d1c-9d32a9f7396e\") " Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327531 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327549 4546 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327566 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwwh2\" (UniqueName: \"kubernetes.io/projected/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-kube-api-access-vwwh2\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327579 4546 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327593 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj7dr\" (UniqueName: \"kubernetes.io/projected/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-kube-api-access-lj7dr\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327603 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327612 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bca0710d-d2ea-4726-84bb-0bf49d93a63a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327623 4546 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327635 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdnzz\" (UniqueName: \"kubernetes.io/projected/318d8499-a380-4204-b4ee-15d2692874e3-kube-api-access-bdnzz\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.327646 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxbd5\" (UniqueName: \"kubernetes.io/projected/bca0710d-d2ea-4726-84bb-0bf49d93a63a-kube-api-access-lxbd5\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.329094 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-utilities" (OuterVolumeSpecName: "utilities") pod "a4096fe8-44f5-466f-9d1c-9d32a9f7396e" (UID: "a4096fe8-44f5-466f-9d1c-9d32a9f7396e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.338238 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "318d8499-a380-4204-b4ee-15d2692874e3" (UID: "318d8499-a380-4204-b4ee-15d2692874e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.347791 4546 generic.go:334] "Generic (PLEG): container finished" podID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerID="a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775" exitCode=0 Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.347928 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kn94x" event={"ID":"bca0710d-d2ea-4726-84bb-0bf49d93a63a","Type":"ContainerDied","Data":"a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.347973 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kn94x" event={"ID":"bca0710d-d2ea-4726-84bb-0bf49d93a63a","Type":"ContainerDied","Data":"9e092949252ef01552e03f5c1eb750dc5019dee92682572aa56d22588852f39c"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.347933 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kn94x" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.351945 4546 scope.go:117] "RemoveContainer" containerID="777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.352756 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4\": container with ID starting with 777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4 not found: ID does not exist" containerID="777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.352811 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4"} err="failed to get container status \"777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4\": rpc error: code = NotFound desc = could not find container \"777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4\": container with ID starting with 777c5eb225deb48443246fa0e7a73a4e2c1e12e999318debb2cacc0b9b51a6a4 not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.354107 4546 scope.go:117] "RemoveContainer" containerID="4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.356391 4546 generic.go:334] "Generic (PLEG): container finished" podID="318d8499-a380-4204-b4ee-15d2692874e3" containerID="7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397" exitCode=0 Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.356484 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fknp6" 
event={"ID":"318d8499-a380-4204-b4ee-15d2692874e3","Type":"ContainerDied","Data":"7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.356510 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fknp6" event={"ID":"318d8499-a380-4204-b4ee-15d2692874e3","Type":"ContainerDied","Data":"1bb4b2e0137fb6c23084990fc45c7c322d8502bf9a590bb508721238815fc0c5"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.357828 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fknp6" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.369248 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerID="68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1" exitCode=0 Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.369306 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vr449" event={"ID":"a4096fe8-44f5-466f-9d1c-9d32a9f7396e","Type":"ContainerDied","Data":"68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.369334 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vr449" event={"ID":"a4096fe8-44f5-466f-9d1c-9d32a9f7396e","Type":"ContainerDied","Data":"fed014177d6ae24b836984e9ad0ea594bd73c1b687fcdf6b89cca5109706cc1e"} Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.369479 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vr449" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.373829 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fkgtj"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.380878 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fkgtj"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.385927 4546 scope.go:117] "RemoveContainer" containerID="af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.423771 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs98t"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.429628 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4096fe8-44f5-466f-9d1c-9d32a9f7396e-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.429675 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318d8499-a380-4204-b4ee-15d2692874e3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.432171 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2tsqn"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.435908 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bs98t"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.440123 4546 scope.go:117] "RemoveContainer" containerID="d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.440276 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-fknp6"] Feb 01 06:46:06 crc kubenswrapper[4546]: W0201 06:46:06.444671 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd2a4a8d_fc02_4822_979d_5c1f30653add.slice/crio-5d2d8eaac742102b5773fbbb25fcfd62882d1dc1787f3b8450c48c148e3bd581 WatchSource:0}: Error finding container 5d2d8eaac742102b5773fbbb25fcfd62882d1dc1787f3b8450c48c148e3bd581: Status 404 returned error can't find the container with id 5d2d8eaac742102b5773fbbb25fcfd62882d1dc1787f3b8450c48c148e3bd581 Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.444757 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fknp6"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.449115 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vr449"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.451548 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vr449"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.464703 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kn94x"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.464757 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kn94x"] Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.486031 4546 scope.go:117] "RemoveContainer" containerID="4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.490040 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db\": container with ID starting with 4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db not found: 
ID does not exist" containerID="4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.490084 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db"} err="failed to get container status \"4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db\": rpc error: code = NotFound desc = could not find container \"4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db\": container with ID starting with 4aea24ceda7acb099e3b2df48d508139588d0e3054aac4dc19e94d3a3fadd9db not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.490123 4546 scope.go:117] "RemoveContainer" containerID="af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.490438 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0\": container with ID starting with af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0 not found: ID does not exist" containerID="af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.490471 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0"} err="failed to get container status \"af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0\": rpc error: code = NotFound desc = could not find container \"af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0\": container with ID starting with af78a88338317a6a2083da186e653d4089e8b9cc78338c791cf9141ee62311d0 not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.490486 4546 
scope.go:117] "RemoveContainer" containerID="d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.491133 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a\": container with ID starting with d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a not found: ID does not exist" containerID="d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.491179 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a"} err="failed to get container status \"d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a\": rpc error: code = NotFound desc = could not find container \"d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a\": container with ID starting with d5cd1129bb2cd57ef8500f2abcedf1c640e642790a126a8846374addde01592a not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.491215 4546 scope.go:117] "RemoveContainer" containerID="a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.506725 4546 scope.go:117] "RemoveContainer" containerID="65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.529699 4546 scope.go:117] "RemoveContainer" containerID="604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.548342 4546 scope.go:117] "RemoveContainer" containerID="a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.549543 4546 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775\": container with ID starting with a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775 not found: ID does not exist" containerID="a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.549622 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775"} err="failed to get container status \"a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775\": rpc error: code = NotFound desc = could not find container \"a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775\": container with ID starting with a55f470a71ef3caf8537008a0676565b2ea34672aa73228f1f8ed779cbf2d775 not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.549684 4546 scope.go:117] "RemoveContainer" containerID="65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.550175 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a\": container with ID starting with 65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a not found: ID does not exist" containerID="65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.550231 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a"} err="failed to get container status \"65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a\": rpc error: code = NotFound desc = could not find container 
\"65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a\": container with ID starting with 65944acd4b13cc44d8e84b30d7459482e697c58e92b29ecd9ee5b6ccf0a13f0a not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.550268 4546 scope.go:117] "RemoveContainer" containerID="604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.552611 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974\": container with ID starting with 604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974 not found: ID does not exist" containerID="604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.552650 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974"} err="failed to get container status \"604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974\": rpc error: code = NotFound desc = could not find container \"604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974\": container with ID starting with 604a9087da7ad3f70ad19d7ab62adc3eaaba460357eaf389376b3a3dcdfae974 not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.552683 4546 scope.go:117] "RemoveContainer" containerID="7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.570015 4546 scope.go:117] "RemoveContainer" containerID="419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.592123 4546 scope.go:117] "RemoveContainer" containerID="8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a" Feb 01 06:46:06 crc 
kubenswrapper[4546]: I0201 06:46:06.605184 4546 scope.go:117] "RemoveContainer" containerID="7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.605634 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397\": container with ID starting with 7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397 not found: ID does not exist" containerID="7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.605678 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397"} err="failed to get container status \"7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397\": rpc error: code = NotFound desc = could not find container \"7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397\": container with ID starting with 7773220478fa89a320cc485ee6b921b1ece0647496942923b82989390aa11397 not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.605720 4546 scope.go:117] "RemoveContainer" containerID="419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.606660 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca\": container with ID starting with 419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca not found: ID does not exist" containerID="419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.606715 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca"} err="failed to get container status \"419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca\": rpc error: code = NotFound desc = could not find container \"419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca\": container with ID starting with 419ad410c43b55d0102058a59170a41794da1f91933b3d2cd75f155ae01509ca not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.606753 4546 scope.go:117] "RemoveContainer" containerID="8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.607884 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a\": container with ID starting with 8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a not found: ID does not exist" containerID="8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.607905 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a"} err="failed to get container status \"8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a\": rpc error: code = NotFound desc = could not find container \"8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a\": container with ID starting with 8f83a2dee18651f615ae3428c2f4831c633f15adc12aa2ba30d11b661cfa229a not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.607926 4546 scope.go:117] "RemoveContainer" containerID="68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.622063 4546 scope.go:117] "RemoveContainer" 
containerID="4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.640824 4546 scope.go:117] "RemoveContainer" containerID="1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.658104 4546 scope.go:117] "RemoveContainer" containerID="68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.658608 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1\": container with ID starting with 68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1 not found: ID does not exist" containerID="68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.658640 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1"} err="failed to get container status \"68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1\": rpc error: code = NotFound desc = could not find container \"68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1\": container with ID starting with 68a9e9a513f40b547afe42ee932207ab327b3291ef8650681fef81ad68b4c6f1 not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.658657 4546 scope.go:117] "RemoveContainer" containerID="4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.658958 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3\": container with ID starting with 
4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3 not found: ID does not exist" containerID="4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.658986 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3"} err="failed to get container status \"4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3\": rpc error: code = NotFound desc = could not find container \"4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3\": container with ID starting with 4193ed0c8a706cad974ad26861890a8098934b852e282f7ea1e8a7c7066951d3 not found: ID does not exist" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.658999 4546 scope.go:117] "RemoveContainer" containerID="1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0" Feb 01 06:46:06 crc kubenswrapper[4546]: E0201 06:46:06.659475 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0\": container with ID starting with 1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0 not found: ID does not exist" containerID="1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0" Feb 01 06:46:06 crc kubenswrapper[4546]: I0201 06:46:06.659521 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0"} err="failed to get container status \"1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0\": rpc error: code = NotFound desc = could not find container \"1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0\": container with ID starting with 1eb68f8ee30e70585aadc87c1ef52b43879c9ca1ceb1809417608544b4be46f0 not found: ID does not 
exist" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.377579 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" event={"ID":"dd2a4a8d-fc02-4822-979d-5c1f30653add","Type":"ContainerStarted","Data":"4b7f83d9f6e263fc220861bc81be4e9715d6eb226a2b54565becc9693085b9f8"} Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.377651 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" event={"ID":"dd2a4a8d-fc02-4822-979d-5c1f30653add","Type":"ContainerStarted","Data":"5d2d8eaac742102b5773fbbb25fcfd62882d1dc1787f3b8450c48c148e3bd581"} Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.378034 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.382897 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.399208 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2tsqn" podStartSLOduration=2.399187418 podStartE2EDuration="2.399187418s" podCreationTimestamp="2026-02-01 06:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:46:07.397847906 +0000 UTC m=+198.048783922" watchObservedRunningTime="2026-02-01 06:46:07.399187418 +0000 UTC m=+198.050123424" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.529998 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lnzzr"] Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530284 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="47612608-8394-4713-b59a-172469b14bbc" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530308 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="47612608-8394-4713-b59a-172469b14bbc" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530324 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530331 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530339 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318d8499-a380-4204-b4ee-15d2692874e3" containerName="extract-content" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530350 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="318d8499-a380-4204-b4ee-15d2692874e3" containerName="extract-content" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530361 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerName="extract-content" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530368 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerName="extract-content" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530378 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318d8499-a380-4204-b4ee-15d2692874e3" containerName="extract-utilities" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530385 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="318d8499-a380-4204-b4ee-15d2692874e3" containerName="extract-utilities" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530397 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerName="extract-utilities" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530403 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerName="extract-utilities" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530410 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerName="extract-content" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530415 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerName="extract-content" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530425 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47612608-8394-4713-b59a-172469b14bbc" containerName="extract-utilities" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530431 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="47612608-8394-4713-b59a-172469b14bbc" containerName="extract-utilities" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530438 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerName="extract-utilities" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530444 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerName="extract-utilities" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530451 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530465 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530473 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="47612608-8394-4713-b59a-172469b14bbc" containerName="extract-content" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530478 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="47612608-8394-4713-b59a-172469b14bbc" containerName="extract-content" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530488 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318d8499-a380-4204-b4ee-15d2692874e3" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530494 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="318d8499-a380-4204-b4ee-15d2692874e3" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: E0201 06:46:07.530504 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" containerName="marketplace-operator" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530513 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" containerName="marketplace-operator" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530637 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530656 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530665 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="318d8499-a380-4204-b4ee-15d2692874e3" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530673 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="47612608-8394-4713-b59a-172469b14bbc" containerName="registry-server" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.530710 4546 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" containerName="marketplace-operator" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.531667 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.536341 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.542679 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-utilities\") pod \"community-operators-lnzzr\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.542749 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9248\" (UniqueName: \"kubernetes.io/projected/075f574e-3456-4491-884c-6893c8b86ca2-kube-api-access-x9248\") pod \"community-operators-lnzzr\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.542949 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-catalog-content\") pod \"community-operators-lnzzr\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.546469 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnzzr"] Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.644164 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-utilities\") pod \"community-operators-lnzzr\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.644217 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9248\" (UniqueName: \"kubernetes.io/projected/075f574e-3456-4491-884c-6893c8b86ca2-kube-api-access-x9248\") pod \"community-operators-lnzzr\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.644273 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-catalog-content\") pod \"community-operators-lnzzr\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.644703 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-utilities\") pod \"community-operators-lnzzr\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.644795 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-catalog-content\") pod \"community-operators-lnzzr\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.661399 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="318d8499-a380-4204-b4ee-15d2692874e3" path="/var/lib/kubelet/pods/318d8499-a380-4204-b4ee-15d2692874e3/volumes" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.662069 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47612608-8394-4713-b59a-172469b14bbc" path="/var/lib/kubelet/pods/47612608-8394-4713-b59a-172469b14bbc/volumes" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.663126 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4096fe8-44f5-466f-9d1c-9d32a9f7396e" path="/var/lib/kubelet/pods/a4096fe8-44f5-466f-9d1c-9d32a9f7396e/volumes" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.663775 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9248\" (UniqueName: \"kubernetes.io/projected/075f574e-3456-4491-884c-6893c8b86ca2-kube-api-access-x9248\") pod \"community-operators-lnzzr\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.664391 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca0710d-d2ea-4726-84bb-0bf49d93a63a" path="/var/lib/kubelet/pods/bca0710d-d2ea-4726-84bb-0bf49d93a63a/volumes" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.664947 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697" path="/var/lib/kubelet/pods/ce0b1778-2c61-4a3e-bb62-8d7b2d7bf697/volumes" Feb 01 06:46:07 crc kubenswrapper[4546]: I0201 06:46:07.854129 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.226607 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnzzr"] Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.396732 4546 generic.go:334] "Generic (PLEG): container finished" podID="075f574e-3456-4491-884c-6893c8b86ca2" containerID="e1c7014f21512c975188aa57f7af8972ae575c351235cbc6a29f200c2f116714" exitCode=0 Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.396824 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnzzr" event={"ID":"075f574e-3456-4491-884c-6893c8b86ca2","Type":"ContainerDied","Data":"e1c7014f21512c975188aa57f7af8972ae575c351235cbc6a29f200c2f116714"} Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.396925 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnzzr" event={"ID":"075f574e-3456-4491-884c-6893c8b86ca2","Type":"ContainerStarted","Data":"d427021b7b5cba33fec3efa776928931fd31c7aeb81432741c47da5665b45b02"} Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.528619 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mm6ct"] Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.530024 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.537256 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.548789 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mm6ct"] Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.559193 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a314f9ed-17a9-457a-936b-add71a19af31-utilities\") pod \"redhat-marketplace-mm6ct\" (UID: \"a314f9ed-17a9-457a-936b-add71a19af31\") " pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.559250 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvhsx\" (UniqueName: \"kubernetes.io/projected/a314f9ed-17a9-457a-936b-add71a19af31-kube-api-access-fvhsx\") pod \"redhat-marketplace-mm6ct\" (UID: \"a314f9ed-17a9-457a-936b-add71a19af31\") " pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.559697 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a314f9ed-17a9-457a-936b-add71a19af31-catalog-content\") pod \"redhat-marketplace-mm6ct\" (UID: \"a314f9ed-17a9-457a-936b-add71a19af31\") " pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.661343 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a314f9ed-17a9-457a-936b-add71a19af31-catalog-content\") pod \"redhat-marketplace-mm6ct\" (UID: 
\"a314f9ed-17a9-457a-936b-add71a19af31\") " pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.662423 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a314f9ed-17a9-457a-936b-add71a19af31-utilities\") pod \"redhat-marketplace-mm6ct\" (UID: \"a314f9ed-17a9-457a-936b-add71a19af31\") " pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.662554 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvhsx\" (UniqueName: \"kubernetes.io/projected/a314f9ed-17a9-457a-936b-add71a19af31-kube-api-access-fvhsx\") pod \"redhat-marketplace-mm6ct\" (UID: \"a314f9ed-17a9-457a-936b-add71a19af31\") " pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.662064 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a314f9ed-17a9-457a-936b-add71a19af31-catalog-content\") pod \"redhat-marketplace-mm6ct\" (UID: \"a314f9ed-17a9-457a-936b-add71a19af31\") " pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.662771 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a314f9ed-17a9-457a-936b-add71a19af31-utilities\") pod \"redhat-marketplace-mm6ct\" (UID: \"a314f9ed-17a9-457a-936b-add71a19af31\") " pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.681197 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvhsx\" (UniqueName: \"kubernetes.io/projected/a314f9ed-17a9-457a-936b-add71a19af31-kube-api-access-fvhsx\") pod \"redhat-marketplace-mm6ct\" (UID: \"a314f9ed-17a9-457a-936b-add71a19af31\") " 
pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:08 crc kubenswrapper[4546]: I0201 06:46:08.844617 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.252910 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mm6ct"] Feb 01 06:46:09 crc kubenswrapper[4546]: W0201 06:46:09.254180 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda314f9ed_17a9_457a_936b_add71a19af31.slice/crio-23390ee15d381d38b38c65b249f6bc5e986bb3542232c9efefa6b8857df4385e WatchSource:0}: Error finding container 23390ee15d381d38b38c65b249f6bc5e986bb3542232c9efefa6b8857df4385e: Status 404 returned error can't find the container with id 23390ee15d381d38b38c65b249f6bc5e986bb3542232c9efefa6b8857df4385e Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.404630 4546 generic.go:334] "Generic (PLEG): container finished" podID="a314f9ed-17a9-457a-936b-add71a19af31" containerID="5ef8d35fe3af747c804b43badf4a48212ca558f4ad22b06cda5faa865eb97b9a" exitCode=0 Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.404720 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm6ct" event={"ID":"a314f9ed-17a9-457a-936b-add71a19af31","Type":"ContainerDied","Data":"5ef8d35fe3af747c804b43badf4a48212ca558f4ad22b06cda5faa865eb97b9a"} Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.404764 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm6ct" event={"ID":"a314f9ed-17a9-457a-936b-add71a19af31","Type":"ContainerStarted","Data":"23390ee15d381d38b38c65b249f6bc5e986bb3542232c9efefa6b8857df4385e"} Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.411215 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-lnzzr" event={"ID":"075f574e-3456-4491-884c-6893c8b86ca2","Type":"ContainerStarted","Data":"cd8076f7a1375be45031f91c0d201b0a781c3de49488412b77454442083a232d"} Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.938750 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ntjgt"] Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.940264 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.942353 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntjgt"] Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.948671 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.981042 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18709a02-cd9b-44ca-b99c-80f2907a26a6-utilities\") pod \"redhat-operators-ntjgt\" (UID: \"18709a02-cd9b-44ca-b99c-80f2907a26a6\") " pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.981092 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb2bg\" (UniqueName: \"kubernetes.io/projected/18709a02-cd9b-44ca-b99c-80f2907a26a6-kube-api-access-kb2bg\") pod \"redhat-operators-ntjgt\" (UID: \"18709a02-cd9b-44ca-b99c-80f2907a26a6\") " pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:09 crc kubenswrapper[4546]: I0201 06:46:09.981120 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/18709a02-cd9b-44ca-b99c-80f2907a26a6-catalog-content\") pod \"redhat-operators-ntjgt\" (UID: \"18709a02-cd9b-44ca-b99c-80f2907a26a6\") " pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.083044 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18709a02-cd9b-44ca-b99c-80f2907a26a6-utilities\") pod \"redhat-operators-ntjgt\" (UID: \"18709a02-cd9b-44ca-b99c-80f2907a26a6\") " pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.083111 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb2bg\" (UniqueName: \"kubernetes.io/projected/18709a02-cd9b-44ca-b99c-80f2907a26a6-kube-api-access-kb2bg\") pod \"redhat-operators-ntjgt\" (UID: \"18709a02-cd9b-44ca-b99c-80f2907a26a6\") " pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.083148 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18709a02-cd9b-44ca-b99c-80f2907a26a6-catalog-content\") pod \"redhat-operators-ntjgt\" (UID: \"18709a02-cd9b-44ca-b99c-80f2907a26a6\") " pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.083616 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18709a02-cd9b-44ca-b99c-80f2907a26a6-utilities\") pod \"redhat-operators-ntjgt\" (UID: \"18709a02-cd9b-44ca-b99c-80f2907a26a6\") " pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.083673 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/18709a02-cd9b-44ca-b99c-80f2907a26a6-catalog-content\") pod \"redhat-operators-ntjgt\" (UID: \"18709a02-cd9b-44ca-b99c-80f2907a26a6\") " pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.101896 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb2bg\" (UniqueName: \"kubernetes.io/projected/18709a02-cd9b-44ca-b99c-80f2907a26a6-kube-api-access-kb2bg\") pod \"redhat-operators-ntjgt\" (UID: \"18709a02-cd9b-44ca-b99c-80f2907a26a6\") " pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.251446 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.421427 4546 generic.go:334] "Generic (PLEG): container finished" podID="a314f9ed-17a9-457a-936b-add71a19af31" containerID="8d67640345e53c1497e8b2cad2a92352f07bfe9194bd25b6ee0bd0bd58e23839" exitCode=0 Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.421604 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm6ct" event={"ID":"a314f9ed-17a9-457a-936b-add71a19af31","Type":"ContainerDied","Data":"8d67640345e53c1497e8b2cad2a92352f07bfe9194bd25b6ee0bd0bd58e23839"} Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.423477 4546 generic.go:334] "Generic (PLEG): container finished" podID="075f574e-3456-4491-884c-6893c8b86ca2" containerID="cd8076f7a1375be45031f91c0d201b0a781c3de49488412b77454442083a232d" exitCode=0 Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.423508 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnzzr" event={"ID":"075f574e-3456-4491-884c-6893c8b86ca2","Type":"ContainerDied","Data":"cd8076f7a1375be45031f91c0d201b0a781c3de49488412b77454442083a232d"} Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 
06:46:10.661615 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntjgt"] Feb 01 06:46:10 crc kubenswrapper[4546]: W0201 06:46:10.670914 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18709a02_cd9b_44ca_b99c_80f2907a26a6.slice/crio-a09635b61ebc59eca1223ad77e91238800c330f1e4ee799911fdd19695835bb7 WatchSource:0}: Error finding container a09635b61ebc59eca1223ad77e91238800c330f1e4ee799911fdd19695835bb7: Status 404 returned error can't find the container with id a09635b61ebc59eca1223ad77e91238800c330f1e4ee799911fdd19695835bb7 Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.940399 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5677v"] Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.943830 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.945847 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.951078 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5677v"] Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.994919 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c18c35-b36d-419c-ac2c-1007219f143a-utilities\") pod \"certified-operators-5677v\" (UID: \"16c18c35-b36d-419c-ac2c-1007219f143a\") " pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.995054 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv7kr\" (UniqueName: 
\"kubernetes.io/projected/16c18c35-b36d-419c-ac2c-1007219f143a-kube-api-access-wv7kr\") pod \"certified-operators-5677v\" (UID: \"16c18c35-b36d-419c-ac2c-1007219f143a\") " pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:10 crc kubenswrapper[4546]: I0201 06:46:10.995182 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c18c35-b36d-419c-ac2c-1007219f143a-catalog-content\") pod \"certified-operators-5677v\" (UID: \"16c18c35-b36d-419c-ac2c-1007219f143a\") " pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.096009 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c18c35-b36d-419c-ac2c-1007219f143a-catalog-content\") pod \"certified-operators-5677v\" (UID: \"16c18c35-b36d-419c-ac2c-1007219f143a\") " pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.096321 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c18c35-b36d-419c-ac2c-1007219f143a-utilities\") pod \"certified-operators-5677v\" (UID: \"16c18c35-b36d-419c-ac2c-1007219f143a\") " pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.096345 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv7kr\" (UniqueName: \"kubernetes.io/projected/16c18c35-b36d-419c-ac2c-1007219f143a-kube-api-access-wv7kr\") pod \"certified-operators-5677v\" (UID: \"16c18c35-b36d-419c-ac2c-1007219f143a\") " pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.096452 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/16c18c35-b36d-419c-ac2c-1007219f143a-catalog-content\") pod \"certified-operators-5677v\" (UID: \"16c18c35-b36d-419c-ac2c-1007219f143a\") " pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.096699 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c18c35-b36d-419c-ac2c-1007219f143a-utilities\") pod \"certified-operators-5677v\" (UID: \"16c18c35-b36d-419c-ac2c-1007219f143a\") " pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.114034 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv7kr\" (UniqueName: \"kubernetes.io/projected/16c18c35-b36d-419c-ac2c-1007219f143a-kube-api-access-wv7kr\") pod \"certified-operators-5677v\" (UID: \"16c18c35-b36d-419c-ac2c-1007219f143a\") " pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.257079 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.434790 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnzzr" event={"ID":"075f574e-3456-4491-884c-6893c8b86ca2","Type":"ContainerStarted","Data":"d93dd4495b462494bd1f3a6449b44437eb3dcfb6bf0f5ff3f79d62e8c715597b"} Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.446343 4546 generic.go:334] "Generic (PLEG): container finished" podID="18709a02-cd9b-44ca-b99c-80f2907a26a6" containerID="630040f7084fb6c2f754c7107bfc2f94d2a542a1ded8f62528d8f46deb4a2622" exitCode=0 Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.446395 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntjgt" event={"ID":"18709a02-cd9b-44ca-b99c-80f2907a26a6","Type":"ContainerDied","Data":"630040f7084fb6c2f754c7107bfc2f94d2a542a1ded8f62528d8f46deb4a2622"} Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.446417 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntjgt" event={"ID":"18709a02-cd9b-44ca-b99c-80f2907a26a6","Type":"ContainerStarted","Data":"a09635b61ebc59eca1223ad77e91238800c330f1e4ee799911fdd19695835bb7"} Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.456555 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mm6ct" event={"ID":"a314f9ed-17a9-457a-936b-add71a19af31","Type":"ContainerStarted","Data":"981d935ed8a04f07c2296b3c77e7855dd160409d43eeeb84aa640566ad9aadf9"} Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.462752 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lnzzr" podStartSLOduration=1.89614093 podStartE2EDuration="4.46273461s" podCreationTimestamp="2026-02-01 06:46:07 +0000 UTC" firstStartedPulling="2026-02-01 06:46:08.39840662 +0000 UTC 
m=+199.049342635" lastFinishedPulling="2026-02-01 06:46:10.965000298 +0000 UTC m=+201.615936315" observedRunningTime="2026-02-01 06:46:11.460707269 +0000 UTC m=+202.111643286" watchObservedRunningTime="2026-02-01 06:46:11.46273461 +0000 UTC m=+202.113670626" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.480605 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mm6ct" podStartSLOduration=1.90440057 podStartE2EDuration="3.480588193s" podCreationTimestamp="2026-02-01 06:46:08 +0000 UTC" firstStartedPulling="2026-02-01 06:46:09.406501018 +0000 UTC m=+200.057437034" lastFinishedPulling="2026-02-01 06:46:10.98268864 +0000 UTC m=+201.633624657" observedRunningTime="2026-02-01 06:46:11.479849441 +0000 UTC m=+202.130785456" watchObservedRunningTime="2026-02-01 06:46:11.480588193 +0000 UTC m=+202.131524209" Feb 01 06:46:11 crc kubenswrapper[4546]: I0201 06:46:11.682008 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5677v"] Feb 01 06:46:11 crc kubenswrapper[4546]: W0201 06:46:11.692570 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c18c35_b36d_419c_ac2c_1007219f143a.slice/crio-7a64254870177172e17bb0918dfd31e4f30f7bd2a242a6bea474091a0decef64 WatchSource:0}: Error finding container 7a64254870177172e17bb0918dfd31e4f30f7bd2a242a6bea474091a0decef64: Status 404 returned error can't find the container with id 7a64254870177172e17bb0918dfd31e4f30f7bd2a242a6bea474091a0decef64 Feb 01 06:46:12 crc kubenswrapper[4546]: I0201 06:46:12.463060 4546 generic.go:334] "Generic (PLEG): container finished" podID="16c18c35-b36d-419c-ac2c-1007219f143a" containerID="1abaa808874106586bd1d69120b6836f4db4c08f75d5bccd1bfaf66bfd60b52b" exitCode=0 Feb 01 06:46:12 crc kubenswrapper[4546]: I0201 06:46:12.463160 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-5677v" event={"ID":"16c18c35-b36d-419c-ac2c-1007219f143a","Type":"ContainerDied","Data":"1abaa808874106586bd1d69120b6836f4db4c08f75d5bccd1bfaf66bfd60b52b"} Feb 01 06:46:12 crc kubenswrapper[4546]: I0201 06:46:12.463672 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5677v" event={"ID":"16c18c35-b36d-419c-ac2c-1007219f143a","Type":"ContainerStarted","Data":"7a64254870177172e17bb0918dfd31e4f30f7bd2a242a6bea474091a0decef64"} Feb 01 06:46:12 crc kubenswrapper[4546]: I0201 06:46:12.468114 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntjgt" event={"ID":"18709a02-cd9b-44ca-b99c-80f2907a26a6","Type":"ContainerStarted","Data":"c84af0dd3cd1da5a8a6d07cb472bdc6a0341273d35e9e9af75c8589f61a6cdb6"} Feb 01 06:46:13 crc kubenswrapper[4546]: I0201 06:46:13.477571 4546 generic.go:334] "Generic (PLEG): container finished" podID="18709a02-cd9b-44ca-b99c-80f2907a26a6" containerID="c84af0dd3cd1da5a8a6d07cb472bdc6a0341273d35e9e9af75c8589f61a6cdb6" exitCode=0 Feb 01 06:46:13 crc kubenswrapper[4546]: I0201 06:46:13.477697 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntjgt" event={"ID":"18709a02-cd9b-44ca-b99c-80f2907a26a6","Type":"ContainerDied","Data":"c84af0dd3cd1da5a8a6d07cb472bdc6a0341273d35e9e9af75c8589f61a6cdb6"} Feb 01 06:46:13 crc kubenswrapper[4546]: I0201 06:46:13.482822 4546 generic.go:334] "Generic (PLEG): container finished" podID="16c18c35-b36d-419c-ac2c-1007219f143a" containerID="c9d76c0d73ca749c10ba3b5d4c0a3ef084aad1c7f44b7b978baf5fd30ee10c94" exitCode=0 Feb 01 06:46:13 crc kubenswrapper[4546]: I0201 06:46:13.482879 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5677v" event={"ID":"16c18c35-b36d-419c-ac2c-1007219f143a","Type":"ContainerDied","Data":"c9d76c0d73ca749c10ba3b5d4c0a3ef084aad1c7f44b7b978baf5fd30ee10c94"} 
Feb 01 06:46:14 crc kubenswrapper[4546]: I0201 06:46:14.491547 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntjgt" event={"ID":"18709a02-cd9b-44ca-b99c-80f2907a26a6","Type":"ContainerStarted","Data":"20a8ae5f559de389a60b7e143c2d24fc78c97f2bdb0b781cfd3f438c5397003b"} Feb 01 06:46:15 crc kubenswrapper[4546]: I0201 06:46:15.501367 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5677v" event={"ID":"16c18c35-b36d-419c-ac2c-1007219f143a","Type":"ContainerStarted","Data":"8732dbf13a391c610bb0b70c9143e93463fd4cdf7b6744ad197d3728c4381fbb"} Feb 01 06:46:15 crc kubenswrapper[4546]: I0201 06:46:15.522625 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5677v" podStartSLOduration=3.985648273 podStartE2EDuration="5.52261109s" podCreationTimestamp="2026-02-01 06:46:10 +0000 UTC" firstStartedPulling="2026-02-01 06:46:12.465171283 +0000 UTC m=+203.116107299" lastFinishedPulling="2026-02-01 06:46:14.0021341 +0000 UTC m=+204.653070116" observedRunningTime="2026-02-01 06:46:15.520723984 +0000 UTC m=+206.171660000" watchObservedRunningTime="2026-02-01 06:46:15.52261109 +0000 UTC m=+206.173547106" Feb 01 06:46:15 crc kubenswrapper[4546]: I0201 06:46:15.523640 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ntjgt" podStartSLOduration=3.99972592 podStartE2EDuration="6.523632756s" podCreationTimestamp="2026-02-01 06:46:09 +0000 UTC" firstStartedPulling="2026-02-01 06:46:11.448026262 +0000 UTC m=+202.098962278" lastFinishedPulling="2026-02-01 06:46:13.971933098 +0000 UTC m=+204.622869114" observedRunningTime="2026-02-01 06:46:14.512926868 +0000 UTC m=+205.163862884" watchObservedRunningTime="2026-02-01 06:46:15.523632756 +0000 UTC m=+206.174568771" Feb 01 06:46:17 crc kubenswrapper[4546]: I0201 06:46:17.854368 4546 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:17 crc kubenswrapper[4546]: I0201 06:46:17.855087 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:17 crc kubenswrapper[4546]: I0201 06:46:17.896331 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:18 crc kubenswrapper[4546]: I0201 06:46:18.559253 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lnzzr" Feb 01 06:46:18 crc kubenswrapper[4546]: I0201 06:46:18.845769 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:18 crc kubenswrapper[4546]: I0201 06:46:18.845836 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:18 crc kubenswrapper[4546]: I0201 06:46:18.880604 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:19 crc kubenswrapper[4546]: I0201 06:46:19.564196 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mm6ct" Feb 01 06:46:20 crc kubenswrapper[4546]: I0201 06:46:20.252599 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:20 crc kubenswrapper[4546]: I0201 06:46:20.252661 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:20 crc kubenswrapper[4546]: I0201 06:46:20.286023 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:20 
crc kubenswrapper[4546]: I0201 06:46:20.569008 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ntjgt" Feb 01 06:46:21 crc kubenswrapper[4546]: I0201 06:46:21.257974 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:21 crc kubenswrapper[4546]: I0201 06:46:21.258030 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:21 crc kubenswrapper[4546]: I0201 06:46:21.313101 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:21 crc kubenswrapper[4546]: I0201 06:46:21.568741 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5677v" Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.667932 4546 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.668410 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72" gracePeriod=15 Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.668446 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4" gracePeriod=15 Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.668494 4546 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92" gracePeriod=15
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.668446 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6" gracePeriod=15
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.668638 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f" gracePeriod=15
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.669767 4546 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 01 06:46:22 crc kubenswrapper[4546]: E0201 06:46:22.670069 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670088 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 01 06:46:22 crc kubenswrapper[4546]: E0201 06:46:22.670099 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670105 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 01 06:46:22 crc kubenswrapper[4546]: E0201 06:46:22.670115 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670124 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 01 06:46:22 crc kubenswrapper[4546]: E0201 06:46:22.670141 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670147 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 01 06:46:22 crc kubenswrapper[4546]: E0201 06:46:22.670155 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670160 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 01 06:46:22 crc kubenswrapper[4546]: E0201 06:46:22.670168 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670173 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 01 06:46:22 crc kubenswrapper[4546]: E0201 06:46:22.670180 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670185 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670317 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670329 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670337 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670344 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670351 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.670359 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.674893 4546 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.675877 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.684195 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.684317 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.684400 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.684487 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.684637 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.684714 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.684875 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.685296 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.684552 4546 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787334 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787373 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787393 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787421 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787443 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787462 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787489 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787508 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787589 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787623 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787644 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787661 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787678 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787701 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787718 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 01 06:46:22 crc kubenswrapper[4546]: I0201 06:46:22.787737 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.552746 4546 generic.go:334] "Generic (PLEG): container finished" podID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" containerID="d177d8be60be8b9a2382c4ddc95879b7557bdb7c42e14862cce71ff401b15c0b" exitCode=0
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.552822 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b169bb8d-05fd-433a-ab97-3433c3cb42d3","Type":"ContainerDied","Data":"d177d8be60be8b9a2382c4ddc95879b7557bdb7c42e14862cce71ff401b15c0b"}
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.554833 4546 status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.555386 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.556632 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.557422 4546 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92" exitCode=0
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.557483 4546 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6" exitCode=0
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.557497 4546 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4" exitCode=0
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.557504 4546 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f" exitCode=2
Feb 01 06:46:23 crc kubenswrapper[4546]: I0201 06:46:23.557571 4546 scope.go:117] "RemoveContainer" containerID="9e070161490c881e47a1cf5c4e2f2f7b280856d5977e81f309433da0e2e0d5ab"
Feb 01 06:46:24 crc kubenswrapper[4546]: E0201 06:46:24.138429 4546 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:24 crc kubenswrapper[4546]: E0201 06:46:24.138998 4546 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:24 crc kubenswrapper[4546]: E0201 06:46:24.139202 4546 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:24 crc kubenswrapper[4546]: E0201 06:46:24.139411 4546 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:24 crc kubenswrapper[4546]: E0201 06:46:24.139594 4546 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:24 crc kubenswrapper[4546]: I0201 06:46:24.139621 4546 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 01 06:46:24 crc kubenswrapper[4546]: E0201 06:46:24.139766 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused" interval="200ms"
Feb 01 06:46:24 crc kubenswrapper[4546]: E0201 06:46:24.341100 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused" interval="400ms"
Feb 01 06:46:24 crc kubenswrapper[4546]: I0201 06:46:24.567222 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 01 06:46:24 crc kubenswrapper[4546]: E0201 06:46:24.741742 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused" interval="800ms"
Feb 01 06:46:24 crc kubenswrapper[4546]: I0201 06:46:24.854827 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:46:24 crc kubenswrapper[4546]: I0201 06:46:24.855449 4546 status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.024837 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-var-lock\") pod \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") "
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.024954 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-var-lock" (OuterVolumeSpecName: "var-lock") pod "b169bb8d-05fd-433a-ab97-3433c3cb42d3" (UID: "b169bb8d-05fd-433a-ab97-3433c3cb42d3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.024977 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kube-api-access\") pod \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") "
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.025028 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kubelet-dir\") pod \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\" (UID: \"b169bb8d-05fd-433a-ab97-3433c3cb42d3\") "
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.025143 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b169bb8d-05fd-433a-ab97-3433c3cb42d3" (UID: "b169bb8d-05fd-433a-ab97-3433c3cb42d3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.025307 4546 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-var-lock\") on node \"crc\" DevicePath \"\""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.025327 4546 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.032965 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b169bb8d-05fd-433a-ab97-3433c3cb42d3" (UID: "b169bb8d-05fd-433a-ab97-3433c3cb42d3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.109755 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.110458 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.110926 4546 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.111283 4546 status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.125983 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b169bb8d-05fd-433a-ab97-3433c3cb42d3-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.226915 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.228024 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.227015 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.228069 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.228145 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.228227 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.228537 4546 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.228563 4546 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.228574 4546 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.421221 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.421274 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.421337 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx"
Feb 01 06:46:25 crc kubenswrapper[4546]: E0201 06:46:25.421757 4546 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-dwtsx.18900c6f3aa44bd7\": dial tcp 192.168.26.196:6443: connect: connection refused" event=<
Feb 01 06:46:25 crc kubenswrapper[4546]: &Event{ObjectMeta:{machine-config-daemon-dwtsx.18900c6f3aa44bd7 openshift-machine-config-operator 29490 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-dwtsx,UID:a4316448-1833-40f9-bdd7-e13d7dd4da6b,APIVersion:v1,ResourceVersion:26444,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused
Feb 01 06:46:25 crc kubenswrapper[4546]: body:
Feb 01 06:46:25 crc kubenswrapper[4546]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:45:25 +0000 UTC,LastTimestamp:2026-02-01 06:46:25.421253949 +0000 UTC m=+216.072189966,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 01 06:46:25 crc kubenswrapper[4546]: >
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.422181 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.422235 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf" gracePeriod=600
Feb 01 06:46:25 crc kubenswrapper[4546]: E0201 06:46:25.543342 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused" interval="1.6s"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.575463 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf" exitCode=0
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.575541 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf"}
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.576653 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b169bb8d-05fd-433a-ab97-3433c3cb42d3","Type":"ContainerDied","Data":"f92829cfd65e1f49c9b8ff24d7f5e690d17de5b8614ac34718c00e999c970c58"}
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.576676 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f92829cfd65e1f49c9b8ff24d7f5e690d17de5b8614ac34718c00e999c970c58"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.576787 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.578730 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.579337 4546 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72" exitCode=0
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.579388 4546 scope.go:117] "RemoveContainer" containerID="4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.579425 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.596511 4546 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.596682 4546 status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.597702 4546 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.597934 4546 status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.601975 4546 scope.go:117] "RemoveContainer" containerID="f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.622804 4546 scope.go:117] "RemoveContainer" containerID="8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.636342 4546 scope.go:117] "RemoveContainer" containerID="8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.651912 4546 scope.go:117] "RemoveContainer" containerID="5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.661740 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.672797 4546 scope.go:117] "RemoveContainer" containerID="4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.700753 4546 scope.go:117] "RemoveContainer" containerID="4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92"
Feb 01 06:46:25 crc kubenswrapper[4546]: E0201 06:46:25.707063 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\": container with ID starting with 4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92 not found: ID does not exist" containerID="4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.707131 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92"} err="failed to get container status \"4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\": rpc error: code = NotFound desc = could not find container \"4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92\": container with ID starting with 4b54af13da11ab6f04d206a4da00a6e7167830215bd4bf32bc70080f5ea6bb92 not found: ID does not exist"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.707167 4546 scope.go:117] "RemoveContainer" containerID="f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6"
Feb 01 06:46:25 crc kubenswrapper[4546]: E0201 06:46:25.709439 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\": container with ID starting with f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6 not found: ID does not exist" containerID="f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.709573 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6"} err="failed to get container status \"f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\": rpc error: code = NotFound desc = could not find container \"f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6\": container with ID starting with f3b28d56e9690f83ff2eb368be9fdca26dd83c71b69ac4d7cc08d4b7ff5dd8b6 not found: ID does not exist"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.709679 4546 scope.go:117] "RemoveContainer" containerID="8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4"
Feb 01 06:46:25 crc kubenswrapper[4546]: E0201 06:46:25.710580 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\": container with ID starting with 8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4 not found: ID does not exist" containerID="8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.710630 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4"} err="failed to get container status \"8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\": rpc error: code = NotFound desc = could not find container \"8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4\": container with ID starting with 8688291eea241e5140dece9030fb282f1794196b889237e49bcdc10d6b6a62a4 not found: ID does not exist"
Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.710664 4546 scope.go:117] "RemoveContainer" containerID="8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f"
Feb 01 06:46:25 crc kubenswrapper[4546]: E0201 06:46:25.711391 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\": container with ID starting with 8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f not found: ID does not exist"
containerID="8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f" Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.711592 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f"} err="failed to get container status \"8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\": rpc error: code = NotFound desc = could not find container \"8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f\": container with ID starting with 8b35ccbc9983b61b50eb2cdbc82bdf5e9d4535979d54aeac94ba08cacd06048f not found: ID does not exist" Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.711848 4546 scope.go:117] "RemoveContainer" containerID="5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72" Feb 01 06:46:25 crc kubenswrapper[4546]: E0201 06:46:25.712679 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\": container with ID starting with 5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72 not found: ID does not exist" containerID="5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72" Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.712720 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72"} err="failed to get container status \"5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\": rpc error: code = NotFound desc = could not find container \"5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72\": container with ID starting with 5e4bd2618e77bd265e7e6a3c353abdf5259ffc0affc5cf95e8aa44f52099da72 not found: ID does not exist" Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.712746 4546 scope.go:117] 
"RemoveContainer" containerID="4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd" Feb 01 06:46:25 crc kubenswrapper[4546]: E0201 06:46:25.713107 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\": container with ID starting with 4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd not found: ID does not exist" containerID="4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd" Feb 01 06:46:25 crc kubenswrapper[4546]: I0201 06:46:25.713143 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd"} err="failed to get container status \"4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\": rpc error: code = NotFound desc = could not find container \"4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd\": container with ID starting with 4b05aa7b8a73bf61a7865879bfd2726186231cedb20616b9290643651043a7bd not found: ID does not exist" Feb 01 06:46:26 crc kubenswrapper[4546]: I0201 06:46:26.590959 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"fe514fb7e5a4706637156f35a07f75c3df77c458aae7b607aeb24537d931b4e3"} Feb 01 06:46:26 crc kubenswrapper[4546]: I0201 06:46:26.591616 4546 status_manager.go:851] "Failed to get status for pod" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dwtsx\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:26 crc kubenswrapper[4546]: I0201 06:46:26.592026 4546 
status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:27 crc kubenswrapper[4546]: E0201 06:46:27.144645 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.196:6443: connect: connection refused" interval="3.2s" Feb 01 06:46:27 crc kubenswrapper[4546]: E0201 06:46:27.696271 4546 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-dwtsx.18900c6f3aa44bd7\": dial tcp 192.168.26.196:6443: connect: connection refused" event=< Feb 01 06:46:27 crc kubenswrapper[4546]: &Event{ObjectMeta:{machine-config-daemon-dwtsx.18900c6f3aa44bd7 openshift-machine-config-operator 29490 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-dwtsx,UID:a4316448-1833-40f9-bdd7-e13d7dd4da6b,APIVersion:v1,ResourceVersion:26444,FieldPath:spec.containers{machine-config-daemon},},Reason:ProbeError,Message:Liveness probe error: Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798: connect: connection refused Feb 01 06:46:27 crc kubenswrapper[4546]: body: Feb 01 06:46:27 crc kubenswrapper[4546]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:45:25 +0000 UTC,LastTimestamp:2026-02-01 06:46:25.421253949 +0000 UTC m=+216.072189966,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 01 06:46:27 crc 
kubenswrapper[4546]: > Feb 01 06:46:27 crc kubenswrapper[4546]: E0201 06:46:27.711205 4546 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.26.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:46:27 crc kubenswrapper[4546]: I0201 06:46:27.711619 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:46:27 crc kubenswrapper[4546]: W0201 06:46:27.734051 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-adc3fde79bbbfdeaf6201ba0b5ee485977afb9c87bcf557a6febb864dbd66635 WatchSource:0}: Error finding container adc3fde79bbbfdeaf6201ba0b5ee485977afb9c87bcf557a6febb864dbd66635: Status 404 returned error can't find the container with id adc3fde79bbbfdeaf6201ba0b5ee485977afb9c87bcf557a6febb864dbd66635 Feb 01 06:46:28 crc kubenswrapper[4546]: I0201 06:46:28.603361 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7570c2a9137667d7467afbc1a69304d18f8c7dddeac9ee3e57c1908140840ab4"} Feb 01 06:46:28 crc kubenswrapper[4546]: I0201 06:46:28.603708 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"adc3fde79bbbfdeaf6201ba0b5ee485977afb9c87bcf557a6febb864dbd66635"} Feb 01 06:46:28 crc kubenswrapper[4546]: E0201 06:46:28.604266 4546 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 
192.168.26.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:46:28 crc kubenswrapper[4546]: I0201 06:46:28.604450 4546 status_manager.go:851] "Failed to get status for pod" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dwtsx\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:28 crc kubenswrapper[4546]: I0201 06:46:28.604701 4546 status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:29 crc kubenswrapper[4546]: I0201 06:46:29.657320 4546 status_manager.go:851] "Failed to get status for pod" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dwtsx\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:29 crc kubenswrapper[4546]: I0201 06:46:29.657607 4546 status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:30 crc kubenswrapper[4546]: E0201 06:46:30.345943 4546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 192.168.26.196:6443: connect: connection refused" interval="6.4s" Feb 01 06:46:33 crc kubenswrapper[4546]: I0201 06:46:33.654845 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:33 crc kubenswrapper[4546]: I0201 06:46:33.656458 4546 status_manager.go:851] "Failed to get status for pod" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dwtsx\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:33 crc kubenswrapper[4546]: I0201 06:46:33.656949 4546 status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:33 crc kubenswrapper[4546]: I0201 06:46:33.671983 4546 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:33 crc kubenswrapper[4546]: I0201 06:46:33.672023 4546 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:33 crc kubenswrapper[4546]: E0201 06:46:33.672456 4546 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:33 crc kubenswrapper[4546]: I0201 06:46:33.673501 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:33 crc kubenswrapper[4546]: W0201 06:46:33.699425 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-6894f933fdfd79b18837ddc5925146cd4fbd837aa5052c90f513622e95443695 WatchSource:0}: Error finding container 6894f933fdfd79b18837ddc5925146cd4fbd837aa5052c90f513622e95443695: Status 404 returned error can't find the container with id 6894f933fdfd79b18837ddc5925146cd4fbd837aa5052c90f513622e95443695 Feb 01 06:46:34 crc kubenswrapper[4546]: I0201 06:46:34.647630 4546 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="494709bfa8e72f4a05d50ff9a93ea78e678b68a190391f0063050efcefaac8c0" exitCode=0 Feb 01 06:46:34 crc kubenswrapper[4546]: I0201 06:46:34.647677 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"494709bfa8e72f4a05d50ff9a93ea78e678b68a190391f0063050efcefaac8c0"} Feb 01 06:46:34 crc kubenswrapper[4546]: I0201 06:46:34.647705 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6894f933fdfd79b18837ddc5925146cd4fbd837aa5052c90f513622e95443695"} Feb 01 06:46:34 crc kubenswrapper[4546]: I0201 06:46:34.647974 4546 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:34 crc kubenswrapper[4546]: I0201 06:46:34.647988 4546 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:34 crc kubenswrapper[4546]: E0201 06:46:34.648632 4546 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.26.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:34 crc kubenswrapper[4546]: I0201 06:46:34.648685 4546 status_manager.go:851] "Failed to get status for pod" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dwtsx\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:34 crc kubenswrapper[4546]: I0201 06:46:34.650588 4546 status_manager.go:851] "Failed to get status for pod" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.26.196:6443: connect: connection refused" Feb 01 06:46:35 crc kubenswrapper[4546]: I0201 06:46:35.661783 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0635593d605f52ddeca0e84609aae97d6abd9032db4eae0fc71c1444049d10e7"} Feb 01 06:46:35 crc kubenswrapper[4546]: I0201 06:46:35.662188 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"463b8a8d60dfb4ef76e9e821faedb7640b9080ad64357dc2f088e4110b6b7181"} Feb 01 06:46:35 crc kubenswrapper[4546]: I0201 06:46:35.662285 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6141f696bcf8298de3cf9adaeb3e08fbe2e3f365d35707e51a5bd401ea3f68b0"} Feb 01 06:46:35 crc kubenswrapper[4546]: I0201 06:46:35.662312 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"be1558eea055f23ef1347f5b0cd3c6e250728ade2f8c19dcbd2aaebe1a2d9e55"} Feb 01 06:46:35 crc kubenswrapper[4546]: I0201 06:46:35.662325 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"be08e2796defe36ae2c5f329eb435e86b6db52f50a7ab4e1aaf937bffd550a6b"} Feb 01 06:46:35 crc kubenswrapper[4546]: I0201 06:46:35.662450 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:35 crc kubenswrapper[4546]: I0201 06:46:35.662566 4546 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:35 crc kubenswrapper[4546]: I0201 06:46:35.662593 4546 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:37 crc kubenswrapper[4546]: I0201 06:46:37.672957 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 01 06:46:37 crc kubenswrapper[4546]: I0201 06:46:37.673986 4546 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a" exitCode=1 Feb 01 06:46:37 crc kubenswrapper[4546]: I0201 06:46:37.674023 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a"} Feb 01 06:46:37 crc kubenswrapper[4546]: I0201 06:46:37.674379 4546 scope.go:117] "RemoveContainer" containerID="8de65c99c99422bbe63f76c4ddbe6523e30b40c21232c08ea19852fdee90f65a" Feb 01 06:46:38 crc kubenswrapper[4546]: I0201 06:46:38.673750 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:38 crc kubenswrapper[4546]: I0201 06:46:38.674156 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:38 crc kubenswrapper[4546]: I0201 06:46:38.694031 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 01 06:46:38 crc kubenswrapper[4546]: I0201 06:46:38.694090 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3aa8425d95c33d892879549468aa4ba3e6ebe8229a86b6bd4b8a68dc70ec0f9c"} Feb 01 06:46:38 crc kubenswrapper[4546]: I0201 06:46:38.695373 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:39 crc kubenswrapper[4546]: I0201 06:46:39.911583 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:46:41 crc kubenswrapper[4546]: I0201 06:46:41.056783 4546 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:41 crc kubenswrapper[4546]: I0201 06:46:41.164773 4546 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f757a192-f39a-4aef-ae80-1a220f227845" Feb 01 06:46:41 crc kubenswrapper[4546]: I0201 06:46:41.710144 4546 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:41 crc kubenswrapper[4546]: I0201 06:46:41.710186 4546 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:41 crc kubenswrapper[4546]: I0201 06:46:41.712843 4546 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f757a192-f39a-4aef-ae80-1a220f227845" Feb 01 06:46:41 crc kubenswrapper[4546]: I0201 06:46:41.715247 4546 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://be08e2796defe36ae2c5f329eb435e86b6db52f50a7ab4e1aaf937bffd550a6b" Feb 01 06:46:41 crc kubenswrapper[4546]: I0201 06:46:41.715278 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:46:42 crc kubenswrapper[4546]: I0201 06:46:42.038096 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:46:42 crc kubenswrapper[4546]: I0201 06:46:42.041188 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:46:42 crc kubenswrapper[4546]: I0201 06:46:42.716040 4546 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:42 crc kubenswrapper[4546]: I0201 06:46:42.716339 4546 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="89c8ff88-ae22-40a1-b11d-8288582e08c0" Feb 01 06:46:42 crc kubenswrapper[4546]: I0201 06:46:42.718741 4546 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f757a192-f39a-4aef-ae80-1a220f227845" Feb 01 06:46:49 crc kubenswrapper[4546]: I0201 06:46:49.810584 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 01 06:46:49 crc kubenswrapper[4546]: I0201 06:46:49.922254 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:46:50 crc kubenswrapper[4546]: I0201 06:46:50.357656 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 06:46:50 crc kubenswrapper[4546]: I0201 06:46:50.480993 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 01 06:46:51 crc kubenswrapper[4546]: I0201 06:46:51.175723 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 01 06:46:51 crc kubenswrapper[4546]: I0201 06:46:51.233564 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 01 06:46:51 crc kubenswrapper[4546]: I0201 06:46:51.529631 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 01 06:46:51 crc kubenswrapper[4546]: I0201 06:46:51.902770 4546 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 01 06:46:51 crc kubenswrapper[4546]: I0201 06:46:51.937473 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 06:46:52 crc kubenswrapper[4546]: I0201 06:46:52.970823 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 01 06:46:53 crc kubenswrapper[4546]: I0201 06:46:53.017090 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 01 06:46:53 crc kubenswrapper[4546]: I0201 06:46:53.185924 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 01 06:46:53 crc kubenswrapper[4546]: I0201 06:46:53.225413 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 01 06:46:53 crc kubenswrapper[4546]: I0201 06:46:53.415669 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 01 06:46:53 crc kubenswrapper[4546]: I0201 06:46:53.420271 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 01 06:46:54 crc kubenswrapper[4546]: I0201 06:46:54.077467 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 01 06:46:54 crc kubenswrapper[4546]: I0201 06:46:54.166955 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 01 06:46:54 crc kubenswrapper[4546]: I0201 06:46:54.297608 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 01 06:46:54 crc 
kubenswrapper[4546]: I0201 06:46:54.416282 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 01 06:46:54 crc kubenswrapper[4546]: I0201 06:46:54.457919 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 01 06:46:54 crc kubenswrapper[4546]: I0201 06:46:54.588786 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 01 06:46:54 crc kubenswrapper[4546]: I0201 06:46:54.613836 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 01 06:46:54 crc kubenswrapper[4546]: I0201 06:46:54.651525 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 01 06:46:54 crc kubenswrapper[4546]: I0201 06:46:54.794128 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 01 06:46:54 crc kubenswrapper[4546]: I0201 06:46:54.994350 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.050526 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.189541 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.293850 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.635287 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.638956 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.871763 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.898748 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.923845 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.961603 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 01 06:46:55 crc kubenswrapper[4546]: I0201 06:46:55.969324 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.020376 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.253549 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.270467 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.319481 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.354582 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.380018 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.605661 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.632343 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.719301 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.724243 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.752061 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 01 06:46:56 crc kubenswrapper[4546]: I0201 06:46:56.766390 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.000699 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.002529 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.050709 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.059660 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.141148 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.159164 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.193888 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.271155 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.284018 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.319885 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.334966 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.363231 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.462540 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.524946 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.530662 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.582255 4546 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.586200 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.586320 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.590347 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.598926 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.598911145 podStartE2EDuration="16.598911145s" podCreationTimestamp="2026-02-01 06:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:46:57.597778641 +0000 UTC m=+248.248714656" watchObservedRunningTime="2026-02-01 06:46:57.598911145 +0000 UTC m=+248.249847162"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.609305 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.663775 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.827934 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.902996 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 01 06:46:57 crc kubenswrapper[4546]: I0201 06:46:57.929990 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.110808 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.151084 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.219323 4546 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.287512 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.311106 4546 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.326000 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.334280 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.382989 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.464891 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.496486 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.573763 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.622594 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.642321 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.655447 4546 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.678412 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.713164 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.914151 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.917790 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 01 06:46:58 crc kubenswrapper[4546]: I0201 06:46:58.984999 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.037157 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.100852 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.131218 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.148234 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.213232 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.326026 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.378923 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.441764 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.474777 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.572465 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.576360 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.582178 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.621434 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.710978 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.742134 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.992728 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 01 06:46:59 crc kubenswrapper[4546]: I0201 06:46:59.994643 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.012074 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.072841 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.159319 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.177623 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.215601 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.262972 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.295829 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.345305 4546 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.449715 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.545873 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.587103 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.674541 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.678621 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.751735 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.781066 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.806103 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.866902 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 01 06:47:00 crc kubenswrapper[4546]: I0201 06:47:00.999199 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.018121 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.105768 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.126793 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.265141 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.322336 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.390839 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.533262 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.538687 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.592034 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.601693 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.641765 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.705306 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.705617 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.766411 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.767539 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.783197 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.827740 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.831384 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.858465 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.867378 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.933192 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 01 06:47:01 crc kubenswrapper[4546]: I0201 06:47:01.981830 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.158773 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.301206 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.395281 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.475718 4546 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.475926 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://7570c2a9137667d7467afbc1a69304d18f8c7dddeac9ee3e57c1908140840ab4" gracePeriod=5
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.643833 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.742743 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.799276 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.801175 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.869174 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.869370 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 01 06:47:02 crc kubenswrapper[4546]: I0201 06:47:02.869973 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.101120 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.213426 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.291807 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.362563 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.444244 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.466237 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.516724 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.560760 4546 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.569674 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.614904 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.652684 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.698736 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.867507 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.908152 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 01 06:47:03 crc kubenswrapper[4546]: I0201 06:47:03.954157 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.000549 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.044679 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.127751 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.251726 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.257734 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.319694 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.352589 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.482058 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.582711 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.609181 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.635088 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.659273 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.703701 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.736690 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.746411 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.802945 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.919357 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.941404 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 01 06:47:04 crc kubenswrapper[4546]: I0201 06:47:04.984034 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.241900 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.248578 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.253368 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.360263 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.401095 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.407961 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.436874 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.457489 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.467054 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.474589 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.526882 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.557754 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.597284 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.601064 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.637924 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.695028 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.723759 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.807702 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.894732 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 01 06:47:05 crc kubenswrapper[4546]: I0201 06:47:05.936748 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.129469 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.227001 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.385654 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.538535 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.691680 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.711574 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.782143 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.787891 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.797134 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.815837 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.816434 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.818060 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.866289 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 01 06:47:06 crc kubenswrapper[4546]: I0201 06:47:06.885065 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.016416 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.199677 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.255198 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.261418 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.405746 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.478184 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.496959 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.544113 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.559443 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 01 06:47:07 crc kubenswrapper[4546]: I0201
06:47:07.559776 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.587096 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.833634 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.833685 4546 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7570c2a9137667d7467afbc1a69304d18f8c7dddeac9ee3e57c1908140840ab4" exitCode=137 Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.920666 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.923220 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 01 06:47:07 crc kubenswrapper[4546]: I0201 06:47:07.945714 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.030925 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.030982 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123599 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123641 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123672 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123699 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123702 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123716 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123703 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123746 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123823 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123848 4546 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123874 4546 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.123883 4546 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.130706 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.156799 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.199220 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.225171 4546 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.225306 4546 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.252073 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.320240 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.364218 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.373135 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.426450 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.550376 4546 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.744484 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.838475 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.838534 4546 scope.go:117] "RemoveContainer" containerID="7570c2a9137667d7467afbc1a69304d18f8c7dddeac9ee3e57c1908140840ab4" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.838614 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:47:08 crc kubenswrapper[4546]: I0201 06:47:08.866703 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 01 06:47:09 crc kubenswrapper[4546]: I0201 06:47:09.461930 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 01 06:47:09 crc kubenswrapper[4546]: I0201 06:47:09.623406 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 01 06:47:09 crc kubenswrapper[4546]: I0201 06:47:09.634157 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 01 06:47:09 crc kubenswrapper[4546]: I0201 06:47:09.660116 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 01 06:47:09 crc kubenswrapper[4546]: I0201 06:47:09.803613 4546 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 01 06:47:10 crc kubenswrapper[4546]: I0201 06:47:10.121613 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 01 06:47:10 crc kubenswrapper[4546]: I0201 06:47:10.946449 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 01 06:47:11 crc kubenswrapper[4546]: I0201 06:47:10.976055 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.503959 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2"] Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.504454 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" podUID="b3051be4-3bf1-4a18-8636-ed39c3a4c479" containerName="route-controller-manager" containerID="cri-o://e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb" gracePeriod=30 Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.633530 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gzcwd"] Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.633716 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" podUID="9760ca7f-b330-4ab0-ae37-57c150826f20" containerName="controller-manager" containerID="cri-o://f46ca2c820fea1a0e0e140b147ce39ae6e363572f3f4fbb2313c85e07531e5da" gracePeriod=30 Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.846614 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.854525 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3051be4-3bf1-4a18-8636-ed39c3a4c479-serving-cert\") pod \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.854584 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkwx8\" (UniqueName: \"kubernetes.io/projected/b3051be4-3bf1-4a18-8636-ed39c3a4c479-kube-api-access-vkwx8\") pod \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.854626 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-client-ca\") pod \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.854709 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-config\") pod \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\" (UID: \"b3051be4-3bf1-4a18-8636-ed39c3a4c479\") " Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.855193 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-client-ca" (OuterVolumeSpecName: "client-ca") pod "b3051be4-3bf1-4a18-8636-ed39c3a4c479" (UID: "b3051be4-3bf1-4a18-8636-ed39c3a4c479"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.855416 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-config" (OuterVolumeSpecName: "config") pod "b3051be4-3bf1-4a18-8636-ed39c3a4c479" (UID: "b3051be4-3bf1-4a18-8636-ed39c3a4c479"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.868160 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3051be4-3bf1-4a18-8636-ed39c3a4c479-kube-api-access-vkwx8" (OuterVolumeSpecName: "kube-api-access-vkwx8") pod "b3051be4-3bf1-4a18-8636-ed39c3a4c479" (UID: "b3051be4-3bf1-4a18-8636-ed39c3a4c479"). InnerVolumeSpecName "kube-api-access-vkwx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.868183 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3051be4-3bf1-4a18-8636-ed39c3a4c479-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b3051be4-3bf1-4a18-8636-ed39c3a4c479" (UID: "b3051be4-3bf1-4a18-8636-ed39c3a4c479"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.933776 4546 generic.go:334] "Generic (PLEG): container finished" podID="b3051be4-3bf1-4a18-8636-ed39c3a4c479" containerID="e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb" exitCode=0 Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.933832 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" event={"ID":"b3051be4-3bf1-4a18-8636-ed39c3a4c479","Type":"ContainerDied","Data":"e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb"} Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.933872 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" event={"ID":"b3051be4-3bf1-4a18-8636-ed39c3a4c479","Type":"ContainerDied","Data":"0c54578d2a054ee4e62b5d1d672985470c1a5bce58dd26f57b79aee91b03bdc7"} Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.933888 4546 scope.go:117] "RemoveContainer" containerID="e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.933977 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.938628 4546 generic.go:334] "Generic (PLEG): container finished" podID="9760ca7f-b330-4ab0-ae37-57c150826f20" containerID="f46ca2c820fea1a0e0e140b147ce39ae6e363572f3f4fbb2313c85e07531e5da" exitCode=0 Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.938654 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" event={"ID":"9760ca7f-b330-4ab0-ae37-57c150826f20","Type":"ContainerDied","Data":"f46ca2c820fea1a0e0e140b147ce39ae6e363572f3f4fbb2313c85e07531e5da"} Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.956027 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.956052 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3051be4-3bf1-4a18-8636-ed39c3a4c479-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.956084 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkwx8\" (UniqueName: \"kubernetes.io/projected/b3051be4-3bf1-4a18-8636-ed39c3a4c479-kube-api-access-vkwx8\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.956092 4546 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b3051be4-3bf1-4a18-8636-ed39c3a4c479-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.960424 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2"] Feb 01 06:47:29 crc kubenswrapper[4546]: 
I0201 06:47:29.962435 4546 scope.go:117] "RemoveContainer" containerID="e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb" Feb 01 06:47:29 crc kubenswrapper[4546]: E0201 06:47:29.962771 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb\": container with ID starting with e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb not found: ID does not exist" containerID="e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.962801 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb"} err="failed to get container status \"e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb\": rpc error: code = NotFound desc = could not find container \"e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb\": container with ID starting with e0bcaa16d16cd28b60d88124df2993dd249f42bff38259c59f6a122a731b74fb not found: ID does not exist" Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.962964 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntrd2"] Feb 01 06:47:29 crc kubenswrapper[4546]: I0201 06:47:29.975140 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.057020 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9760ca7f-b330-4ab0-ae37-57c150826f20-serving-cert\") pod \"9760ca7f-b330-4ab0-ae37-57c150826f20\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.057055 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28gk7\" (UniqueName: \"kubernetes.io/projected/9760ca7f-b330-4ab0-ae37-57c150826f20-kube-api-access-28gk7\") pod \"9760ca7f-b330-4ab0-ae37-57c150826f20\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.057078 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-config\") pod \"9760ca7f-b330-4ab0-ae37-57c150826f20\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.057103 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-proxy-ca-bundles\") pod \"9760ca7f-b330-4ab0-ae37-57c150826f20\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.057124 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-client-ca\") pod \"9760ca7f-b330-4ab0-ae37-57c150826f20\" (UID: \"9760ca7f-b330-4ab0-ae37-57c150826f20\") " Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.057752 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-client-ca" (OuterVolumeSpecName: "client-ca") pod "9760ca7f-b330-4ab0-ae37-57c150826f20" (UID: "9760ca7f-b330-4ab0-ae37-57c150826f20"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.058148 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9760ca7f-b330-4ab0-ae37-57c150826f20" (UID: "9760ca7f-b330-4ab0-ae37-57c150826f20"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.058451 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-config" (OuterVolumeSpecName: "config") pod "9760ca7f-b330-4ab0-ae37-57c150826f20" (UID: "9760ca7f-b330-4ab0-ae37-57c150826f20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.060667 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9760ca7f-b330-4ab0-ae37-57c150826f20-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9760ca7f-b330-4ab0-ae37-57c150826f20" (UID: "9760ca7f-b330-4ab0-ae37-57c150826f20"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.060676 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9760ca7f-b330-4ab0-ae37-57c150826f20-kube-api-access-28gk7" (OuterVolumeSpecName: "kube-api-access-28gk7") pod "9760ca7f-b330-4ab0-ae37-57c150826f20" (UID: "9760ca7f-b330-4ab0-ae37-57c150826f20"). InnerVolumeSpecName "kube-api-access-28gk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.158727 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9760ca7f-b330-4ab0-ae37-57c150826f20-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.158815 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28gk7\" (UniqueName: \"kubernetes.io/projected/9760ca7f-b330-4ab0-ae37-57c150826f20-kube-api-access-28gk7\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.158901 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.158952 4546 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.158996 4546 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9760ca7f-b330-4ab0-ae37-57c150826f20-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.944931 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" event={"ID":"9760ca7f-b330-4ab0-ae37-57c150826f20","Type":"ContainerDied","Data":"b3bccaadc65add06df497eaa46a9832f5282386d60ebcf7298fbc1a0b4d607c2"} Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.944969 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gzcwd" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.945136 4546 scope.go:117] "RemoveContainer" containerID="f46ca2c820fea1a0e0e140b147ce39ae6e363572f3f4fbb2313c85e07531e5da" Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.969272 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gzcwd"] Feb 01 06:47:30 crc kubenswrapper[4546]: I0201 06:47:30.975632 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gzcwd"] Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.659032 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9760ca7f-b330-4ab0-ae37-57c150826f20" path="/var/lib/kubelet/pods/9760ca7f-b330-4ab0-ae37-57c150826f20/volumes" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.659489 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3051be4-3bf1-4a18-8636-ed39c3a4c479" path="/var/lib/kubelet/pods/b3051be4-3bf1-4a18-8636-ed39c3a4c479/volumes" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.682655 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv"] Feb 01 06:47:31 crc kubenswrapper[4546]: E0201 06:47:31.682847 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9760ca7f-b330-4ab0-ae37-57c150826f20" containerName="controller-manager" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.682877 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="9760ca7f-b330-4ab0-ae37-57c150826f20" containerName="controller-manager" Feb 01 06:47:31 crc kubenswrapper[4546]: E0201 06:47:31.682891 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3051be4-3bf1-4a18-8636-ed39c3a4c479" containerName="route-controller-manager" Feb 01 06:47:31 crc 
kubenswrapper[4546]: I0201 06:47:31.682897 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3051be4-3bf1-4a18-8636-ed39c3a4c479" containerName="route-controller-manager" Feb 01 06:47:31 crc kubenswrapper[4546]: E0201 06:47:31.682911 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.682917 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 01 06:47:31 crc kubenswrapper[4546]: E0201 06:47:31.682925 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" containerName="installer" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.682933 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" containerName="installer" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.683008 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3051be4-3bf1-4a18-8636-ed39c3a4c479" containerName="route-controller-manager" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.683016 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.683024 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="9760ca7f-b330-4ab0-ae37-57c150826f20" containerName="controller-manager" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.683034 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="b169bb8d-05fd-433a-ab97-3433c3cb42d3" containerName="installer" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.683355 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.684687 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.685999 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-575858cbff-mp49n"] Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.686549 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.687283 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.687379 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.687507 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.687578 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.688071 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.689821 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.689972 4546 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.690660 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.691926 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.692132 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.692943 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.693427 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.693525 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv"] Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.696486 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-575858cbff-mp49n"] Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.773852 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-client-ca\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.773926 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4h9m6\" (UniqueName: \"kubernetes.io/projected/ae8afcca-6de2-42ca-b04c-d55acd3269d4-kube-api-access-4h9m6\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.773961 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e577eb4b-1ab6-469f-988e-556ac06e28d3-serving-cert\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.774004 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-client-ca\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.774024 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-config\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.774105 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8afcca-6de2-42ca-b04c-d55acd3269d4-serving-cert\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " 
pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.774129 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-proxy-ca-bundles\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.774179 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-config\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.774210 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbq4n\" (UniqueName: \"kubernetes.io/projected/e577eb4b-1ab6-469f-988e-556ac06e28d3-kube-api-access-qbq4n\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.875729 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8afcca-6de2-42ca-b04c-d55acd3269d4-serving-cert\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.875767 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-proxy-ca-bundles\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.875795 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-config\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.875813 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbq4n\" (UniqueName: \"kubernetes.io/projected/e577eb4b-1ab6-469f-988e-556ac06e28d3-kube-api-access-qbq4n\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.875845 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-client-ca\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.875887 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h9m6\" (UniqueName: \"kubernetes.io/projected/ae8afcca-6de2-42ca-b04c-d55acd3269d4-kube-api-access-4h9m6\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " 
pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.875915 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e577eb4b-1ab6-469f-988e-556ac06e28d3-serving-cert\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.875943 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-client-ca\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.875961 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-config\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.877795 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-config\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.877947 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-client-ca\") pod 
\"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.878146 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-config\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.878519 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-proxy-ca-bundles\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.878636 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-client-ca\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.881597 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e577eb4b-1ab6-469f-988e-556ac06e28d3-serving-cert\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.881596 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ae8afcca-6de2-42ca-b04c-d55acd3269d4-serving-cert\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.889565 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbq4n\" (UniqueName: \"kubernetes.io/projected/e577eb4b-1ab6-469f-988e-556ac06e28d3-kube-api-access-qbq4n\") pod \"controller-manager-575858cbff-mp49n\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.889606 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h9m6\" (UniqueName: \"kubernetes.io/projected/ae8afcca-6de2-42ca-b04c-d55acd3269d4-kube-api-access-4h9m6\") pod \"route-controller-manager-775b9f8b87-6q4xv\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:31 crc kubenswrapper[4546]: I0201 06:47:31.998147 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.009282 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.149064 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv"] Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.376940 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-575858cbff-mp49n"] Feb 01 06:47:32 crc kubenswrapper[4546]: W0201 06:47:32.381661 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode577eb4b_1ab6_469f_988e_556ac06e28d3.slice/crio-005773ba0c2fdad12dc10641bf1e9f883265b7252bb00658cd9241ff92e26eca WatchSource:0}: Error finding container 005773ba0c2fdad12dc10641bf1e9f883265b7252bb00658cd9241ff92e26eca: Status 404 returned error can't find the container with id 005773ba0c2fdad12dc10641bf1e9f883265b7252bb00658cd9241ff92e26eca Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.965072 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" event={"ID":"ae8afcca-6de2-42ca-b04c-d55acd3269d4","Type":"ContainerStarted","Data":"6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483"} Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.965277 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.965288 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" event={"ID":"ae8afcca-6de2-42ca-b04c-d55acd3269d4","Type":"ContainerStarted","Data":"feb8747bdfef856aec9ac81ee3a739ccaf024107302d1e411eed34c5ca297322"} Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 
06:47:32.970368 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.970650 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" event={"ID":"e577eb4b-1ab6-469f-988e-556ac06e28d3","Type":"ContainerStarted","Data":"80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6"} Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.970687 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" event={"ID":"e577eb4b-1ab6-469f-988e-556ac06e28d3","Type":"ContainerStarted","Data":"005773ba0c2fdad12dc10641bf1e9f883265b7252bb00658cd9241ff92e26eca"} Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.970849 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.974021 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.976368 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" podStartSLOduration=3.9763591 podStartE2EDuration="3.9763591s" podCreationTimestamp="2026-02-01 06:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:47:32.974818743 +0000 UTC m=+283.625754760" watchObservedRunningTime="2026-02-01 06:47:32.9763591 +0000 UTC m=+283.627295115" Feb 01 06:47:32 crc kubenswrapper[4546]: I0201 06:47:32.989519 4546 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" podStartSLOduration=3.989510709 podStartE2EDuration="3.989510709s" podCreationTimestamp="2026-02-01 06:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:47:32.985734325 +0000 UTC m=+283.636670341" watchObservedRunningTime="2026-02-01 06:47:32.989510709 +0000 UTC m=+283.640446724" Feb 01 06:47:49 crc kubenswrapper[4546]: I0201 06:47:49.505592 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-575858cbff-mp49n"] Feb 01 06:47:49 crc kubenswrapper[4546]: I0201 06:47:49.506302 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" podUID="e577eb4b-1ab6-469f-988e-556ac06e28d3" containerName="controller-manager" containerID="cri-o://80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6" gracePeriod=30 Feb 01 06:47:49 crc kubenswrapper[4546]: I0201 06:47:49.515061 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv"] Feb 01 06:47:49 crc kubenswrapper[4546]: I0201 06:47:49.515318 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" podUID="ae8afcca-6de2-42ca-b04c-d55acd3269d4" containerName="route-controller-manager" containerID="cri-o://6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483" gracePeriod=30 Feb 01 06:47:49 crc kubenswrapper[4546]: I0201 06:47:49.553409 4546 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 01 06:47:49 crc kubenswrapper[4546]: I0201 06:47:49.959176 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.038084 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.059094 4546 generic.go:334] "Generic (PLEG): container finished" podID="e577eb4b-1ab6-469f-988e-556ac06e28d3" containerID="80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6" exitCode=0 Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.059159 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" event={"ID":"e577eb4b-1ab6-469f-988e-556ac06e28d3","Type":"ContainerDied","Data":"80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6"} Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.059188 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" event={"ID":"e577eb4b-1ab6-469f-988e-556ac06e28d3","Type":"ContainerDied","Data":"005773ba0c2fdad12dc10641bf1e9f883265b7252bb00658cd9241ff92e26eca"} Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.059209 4546 scope.go:117] "RemoveContainer" containerID="80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.059228 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-575858cbff-mp49n" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.062802 4546 generic.go:334] "Generic (PLEG): container finished" podID="ae8afcca-6de2-42ca-b04c-d55acd3269d4" containerID="6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483" exitCode=0 Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.062848 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" event={"ID":"ae8afcca-6de2-42ca-b04c-d55acd3269d4","Type":"ContainerDied","Data":"6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483"} Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.062892 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" event={"ID":"ae8afcca-6de2-42ca-b04c-d55acd3269d4","Type":"ContainerDied","Data":"feb8747bdfef856aec9ac81ee3a739ccaf024107302d1e411eed34c5ca297322"} Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.062954 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.077791 4546 scope.go:117] "RemoveContainer" containerID="80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6" Feb 01 06:47:50 crc kubenswrapper[4546]: E0201 06:47:50.078146 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6\": container with ID starting with 80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6 not found: ID does not exist" containerID="80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.078176 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6"} err="failed to get container status \"80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6\": rpc error: code = NotFound desc = could not find container \"80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6\": container with ID starting with 80f61d18f5b74883da37a8c5ff98f014779278bdbcb4b59caa54b874869bdfc6 not found: ID does not exist" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.078196 4546 scope.go:117] "RemoveContainer" containerID="6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.096304 4546 scope.go:117] "RemoveContainer" containerID="6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483" Feb 01 06:47:50 crc kubenswrapper[4546]: E0201 06:47:50.096610 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483\": container with ID starting with 
6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483 not found: ID does not exist" containerID="6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.096637 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483"} err="failed to get container status \"6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483\": rpc error: code = NotFound desc = could not find container \"6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483\": container with ID starting with 6ee98031c20714c80290806215974833a7c9370cc1bfa24d3ecad5310db02483 not found: ID does not exist" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.112553 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8afcca-6de2-42ca-b04c-d55acd3269d4-serving-cert\") pod \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.112639 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-config\") pod \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.112698 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-client-ca\") pod \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.112747 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h9m6\" (UniqueName: 
\"kubernetes.io/projected/ae8afcca-6de2-42ca-b04c-d55acd3269d4-kube-api-access-4h9m6\") pod \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\" (UID: \"ae8afcca-6de2-42ca-b04c-d55acd3269d4\") " Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.114187 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-config" (OuterVolumeSpecName: "config") pod "ae8afcca-6de2-42ca-b04c-d55acd3269d4" (UID: "ae8afcca-6de2-42ca-b04c-d55acd3269d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.116009 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-client-ca" (OuterVolumeSpecName: "client-ca") pod "ae8afcca-6de2-42ca-b04c-d55acd3269d4" (UID: "ae8afcca-6de2-42ca-b04c-d55acd3269d4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.120692 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8afcca-6de2-42ca-b04c-d55acd3269d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ae8afcca-6de2-42ca-b04c-d55acd3269d4" (UID: "ae8afcca-6de2-42ca-b04c-d55acd3269d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.124984 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae8afcca-6de2-42ca-b04c-d55acd3269d4-kube-api-access-4h9m6" (OuterVolumeSpecName: "kube-api-access-4h9m6") pod "ae8afcca-6de2-42ca-b04c-d55acd3269d4" (UID: "ae8afcca-6de2-42ca-b04c-d55acd3269d4"). InnerVolumeSpecName "kube-api-access-4h9m6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.214361 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e577eb4b-1ab6-469f-988e-556ac06e28d3-serving-cert\") pod \"e577eb4b-1ab6-469f-988e-556ac06e28d3\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.214403 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-proxy-ca-bundles\") pod \"e577eb4b-1ab6-469f-988e-556ac06e28d3\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.214462 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-client-ca\") pod \"e577eb4b-1ab6-469f-988e-556ac06e28d3\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.214562 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-config\") pod \"e577eb4b-1ab6-469f-988e-556ac06e28d3\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215042 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-client-ca" (OuterVolumeSpecName: "client-ca") pod "e577eb4b-1ab6-469f-988e-556ac06e28d3" (UID: "e577eb4b-1ab6-469f-988e-556ac06e28d3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215051 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e577eb4b-1ab6-469f-988e-556ac06e28d3" (UID: "e577eb4b-1ab6-469f-988e-556ac06e28d3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215138 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbq4n\" (UniqueName: \"kubernetes.io/projected/e577eb4b-1ab6-469f-988e-556ac06e28d3-kube-api-access-qbq4n\") pod \"e577eb4b-1ab6-469f-988e-556ac06e28d3\" (UID: \"e577eb4b-1ab6-469f-988e-556ac06e28d3\") " Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215138 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-config" (OuterVolumeSpecName: "config") pod "e577eb4b-1ab6-469f-988e-556ac06e28d3" (UID: "e577eb4b-1ab6-469f-988e-556ac06e28d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215542 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h9m6\" (UniqueName: \"kubernetes.io/projected/ae8afcca-6de2-42ca-b04c-d55acd3269d4-kube-api-access-4h9m6\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215570 4546 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215581 4546 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215592 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae8afcca-6de2-42ca-b04c-d55acd3269d4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215610 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215620 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e577eb4b-1ab6-469f-988e-556ac06e28d3-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.215629 4546 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae8afcca-6de2-42ca-b04c-d55acd3269d4-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.216928 4546 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e577eb4b-1ab6-469f-988e-556ac06e28d3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e577eb4b-1ab6-469f-988e-556ac06e28d3" (UID: "e577eb4b-1ab6-469f-988e-556ac06e28d3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.217274 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e577eb4b-1ab6-469f-988e-556ac06e28d3-kube-api-access-qbq4n" (OuterVolumeSpecName: "kube-api-access-qbq4n") pod "e577eb4b-1ab6-469f-988e-556ac06e28d3" (UID: "e577eb4b-1ab6-469f-988e-556ac06e28d3"). InnerVolumeSpecName "kube-api-access-qbq4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.316970 4546 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e577eb4b-1ab6-469f-988e-556ac06e28d3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.317102 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbq4n\" (UniqueName: \"kubernetes.io/projected/e577eb4b-1ab6-469f-988e-556ac06e28d3-kube-api-access-qbq4n\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.383198 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-575858cbff-mp49n"] Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.385995 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-575858cbff-mp49n"] Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.392806 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv"] Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.395652 4546 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-775b9f8b87-6q4xv"] Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.700468 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-ccd68f987-4qtc8"] Feb 01 06:47:50 crc kubenswrapper[4546]: E0201 06:47:50.700710 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8afcca-6de2-42ca-b04c-d55acd3269d4" containerName="route-controller-manager" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.700724 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8afcca-6de2-42ca-b04c-d55acd3269d4" containerName="route-controller-manager" Feb 01 06:47:50 crc kubenswrapper[4546]: E0201 06:47:50.700742 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e577eb4b-1ab6-469f-988e-556ac06e28d3" containerName="controller-manager" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.700748 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e577eb4b-1ab6-469f-988e-556ac06e28d3" containerName="controller-manager" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.700896 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8afcca-6de2-42ca-b04c-d55acd3269d4" containerName="route-controller-manager" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.700913 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e577eb4b-1ab6-469f-988e-556ac06e28d3" containerName="controller-manager" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.701296 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.702983 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.703573 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp"] Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.703841 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.704018 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.704190 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.706290 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.707620 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.708590 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.708963 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.709198 4546 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.709200 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.710529 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.710828 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.710927 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.717921 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-ccd68f987-4qtc8"] Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.719453 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.720801 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp"] Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.823193 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k5k2\" (UniqueName: \"kubernetes.io/projected/8333373a-0a1c-4b7d-b646-5409a5758abd-kube-api-access-5k5k2\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.823266 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8333373a-0a1c-4b7d-b646-5409a5758abd-config\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.823362 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c271196-8738-4a3c-9b67-63e9a7f6c176-client-ca\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.823488 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c271196-8738-4a3c-9b67-63e9a7f6c176-config\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.823778 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l954\" (UniqueName: \"kubernetes.io/projected/3c271196-8738-4a3c-9b67-63e9a7f6c176-kube-api-access-4l954\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.823844 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c271196-8738-4a3c-9b67-63e9a7f6c176-proxy-ca-bundles\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: 
\"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.823949 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8333373a-0a1c-4b7d-b646-5409a5758abd-serving-cert\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.823988 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8333373a-0a1c-4b7d-b646-5409a5758abd-client-ca\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.824057 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c271196-8738-4a3c-9b67-63e9a7f6c176-serving-cert\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.925538 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l954\" (UniqueName: \"kubernetes.io/projected/3c271196-8738-4a3c-9b67-63e9a7f6c176-kube-api-access-4l954\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.925589 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c271196-8738-4a3c-9b67-63e9a7f6c176-proxy-ca-bundles\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.925633 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8333373a-0a1c-4b7d-b646-5409a5758abd-serving-cert\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.925656 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8333373a-0a1c-4b7d-b646-5409a5758abd-client-ca\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.925679 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c271196-8738-4a3c-9b67-63e9a7f6c176-serving-cert\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.926490 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k5k2\" (UniqueName: \"kubernetes.io/projected/8333373a-0a1c-4b7d-b646-5409a5758abd-kube-api-access-5k5k2\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " 
pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.926551 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8333373a-0a1c-4b7d-b646-5409a5758abd-config\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.926641 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c271196-8738-4a3c-9b67-63e9a7f6c176-client-ca\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.926663 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8333373a-0a1c-4b7d-b646-5409a5758abd-client-ca\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.926678 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c271196-8738-4a3c-9b67-63e9a7f6c176-config\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.927022 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c271196-8738-4a3c-9b67-63e9a7f6c176-proxy-ca-bundles\") pod 
\"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.927347 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c271196-8738-4a3c-9b67-63e9a7f6c176-client-ca\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.927549 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8333373a-0a1c-4b7d-b646-5409a5758abd-config\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.927776 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c271196-8738-4a3c-9b67-63e9a7f6c176-config\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.930662 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8333373a-0a1c-4b7d-b646-5409a5758abd-serving-cert\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.930727 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3c271196-8738-4a3c-9b67-63e9a7f6c176-serving-cert\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.940330 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k5k2\" (UniqueName: \"kubernetes.io/projected/8333373a-0a1c-4b7d-b646-5409a5758abd-kube-api-access-5k5k2\") pod \"route-controller-manager-6465ddcfcc-nq7tp\" (UID: \"8333373a-0a1c-4b7d-b646-5409a5758abd\") " pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:50 crc kubenswrapper[4546]: I0201 06:47:50.942516 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l954\" (UniqueName: \"kubernetes.io/projected/3c271196-8738-4a3c-9b67-63e9a7f6c176-kube-api-access-4l954\") pod \"controller-manager-ccd68f987-4qtc8\" (UID: \"3c271196-8738-4a3c-9b67-63e9a7f6c176\") " pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:51 crc kubenswrapper[4546]: I0201 06:47:51.014296 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:51 crc kubenswrapper[4546]: I0201 06:47:51.020079 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:51 crc kubenswrapper[4546]: I0201 06:47:51.390231 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-ccd68f987-4qtc8"] Feb 01 06:47:51 crc kubenswrapper[4546]: W0201 06:47:51.396210 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c271196_8738_4a3c_9b67_63e9a7f6c176.slice/crio-c2eb4f59bef79398fd21bc174136b2f5700984472934a6018dfcf890c2b58760 WatchSource:0}: Error finding container c2eb4f59bef79398fd21bc174136b2f5700984472934a6018dfcf890c2b58760: Status 404 returned error can't find the container with id c2eb4f59bef79398fd21bc174136b2f5700984472934a6018dfcf890c2b58760 Feb 01 06:47:51 crc kubenswrapper[4546]: I0201 06:47:51.443908 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp"] Feb 01 06:47:51 crc kubenswrapper[4546]: I0201 06:47:51.662220 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae8afcca-6de2-42ca-b04c-d55acd3269d4" path="/var/lib/kubelet/pods/ae8afcca-6de2-42ca-b04c-d55acd3269d4/volumes" Feb 01 06:47:51 crc kubenswrapper[4546]: I0201 06:47:51.662975 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e577eb4b-1ab6-469f-988e-556ac06e28d3" path="/var/lib/kubelet/pods/e577eb4b-1ab6-469f-988e-556ac06e28d3/volumes" Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.076349 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" event={"ID":"8333373a-0a1c-4b7d-b646-5409a5758abd","Type":"ContainerStarted","Data":"ac0d46243f0e74437d15ad9cd9aa56ab2992e30cb4cdecb7ccbecf118289f83f"} Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.076939 4546 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.076963 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" event={"ID":"8333373a-0a1c-4b7d-b646-5409a5758abd","Type":"ContainerStarted","Data":"c6e5bfc9d9c4fd51476f1b6cb62d78741b0653c609cc1858a1c2ba22249f8c62"} Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.077751 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" event={"ID":"3c271196-8738-4a3c-9b67-63e9a7f6c176","Type":"ContainerStarted","Data":"f5734f8f33c147cd30625cd5cd1664a73efde2b07048ef5bfe37afa2eb20f8b2"} Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.077790 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" event={"ID":"3c271196-8738-4a3c-9b67-63e9a7f6c176","Type":"ContainerStarted","Data":"c2eb4f59bef79398fd21bc174136b2f5700984472934a6018dfcf890c2b58760"} Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.077965 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.082500 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.083191 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.092421 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6465ddcfcc-nq7tp" 
podStartSLOduration=3.092409894 podStartE2EDuration="3.092409894s" podCreationTimestamp="2026-02-01 06:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:47:52.089775224 +0000 UTC m=+302.740711230" watchObservedRunningTime="2026-02-01 06:47:52.092409894 +0000 UTC m=+302.743345910" Feb 01 06:47:52 crc kubenswrapper[4546]: I0201 06:47:52.104953 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-ccd68f987-4qtc8" podStartSLOduration=3.104942135 podStartE2EDuration="3.104942135s" podCreationTimestamp="2026-02-01 06:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:47:52.101941114 +0000 UTC m=+302.752877130" watchObservedRunningTime="2026-02-01 06:47:52.104942135 +0000 UTC m=+302.755878151" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.202205 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-75tm6"] Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.203252 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.217368 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-75tm6"] Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.403184 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4701bf79-3df0-49d3-9728-f484354958f9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.403263 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4701bf79-3df0-49d3-9728-f484354958f9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.403313 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4701bf79-3df0-49d3-9728-f484354958f9-bound-sa-token\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.403408 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4701bf79-3df0-49d3-9728-f484354958f9-trusted-ca\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.403434 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q64h\" (UniqueName: \"kubernetes.io/projected/4701bf79-3df0-49d3-9728-f484354958f9-kube-api-access-9q64h\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.403483 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.403520 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4701bf79-3df0-49d3-9728-f484354958f9-registry-tls\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.403553 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4701bf79-3df0-49d3-9728-f484354958f9-registry-certificates\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.423195 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.506057 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4701bf79-3df0-49d3-9728-f484354958f9-registry-tls\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.506501 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4701bf79-3df0-49d3-9728-f484354958f9-registry-certificates\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.506739 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4701bf79-3df0-49d3-9728-f484354958f9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.506848 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4701bf79-3df0-49d3-9728-f484354958f9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 
06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.507225 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4701bf79-3df0-49d3-9728-f484354958f9-bound-sa-token\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.507333 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q64h\" (UniqueName: \"kubernetes.io/projected/4701bf79-3df0-49d3-9728-f484354958f9-kube-api-access-9q64h\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.507404 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4701bf79-3df0-49d3-9728-f484354958f9-trusted-ca\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.507490 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4701bf79-3df0-49d3-9728-f484354958f9-ca-trust-extracted\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.508254 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4701bf79-3df0-49d3-9728-f484354958f9-registry-certificates\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.508995 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4701bf79-3df0-49d3-9728-f484354958f9-trusted-ca\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.513820 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4701bf79-3df0-49d3-9728-f484354958f9-registry-tls\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.517041 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4701bf79-3df0-49d3-9728-f484354958f9-installation-pull-secrets\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.521451 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4701bf79-3df0-49d3-9728-f484354958f9-bound-sa-token\") pod \"image-registry-66df7c8f76-75tm6\" (UID: \"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.523557 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q64h\" (UniqueName: \"kubernetes.io/projected/4701bf79-3df0-49d3-9728-f484354958f9-kube-api-access-9q64h\") pod \"image-registry-66df7c8f76-75tm6\" (UID: 
\"4701bf79-3df0-49d3-9728-f484354958f9\") " pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:57 crc kubenswrapper[4546]: I0201 06:47:57.819680 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:47:58 crc kubenswrapper[4546]: I0201 06:47:58.229850 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-75tm6"] Feb 01 06:47:59 crc kubenswrapper[4546]: I0201 06:47:59.122971 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" event={"ID":"4701bf79-3df0-49d3-9728-f484354958f9","Type":"ContainerStarted","Data":"010bacc944925342d3368db7ac80beab18c845df1d54e89937a32535e0ba9b64"} Feb 01 06:47:59 crc kubenswrapper[4546]: I0201 06:47:59.123047 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" event={"ID":"4701bf79-3df0-49d3-9728-f484354958f9","Type":"ContainerStarted","Data":"9991377bb990d714e3d963023b7a86d2a7fd41f4f2cafcc6629110d81298bd0f"} Feb 01 06:47:59 crc kubenswrapper[4546]: I0201 06:47:59.123185 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:48:17 crc kubenswrapper[4546]: I0201 06:48:17.823312 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" Feb 01 06:48:17 crc kubenswrapper[4546]: I0201 06:48:17.836231 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-75tm6" podStartSLOduration=20.836216816 podStartE2EDuration="20.836216816s" podCreationTimestamp="2026-02-01 06:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-01 06:47:59.154493938 +0000 UTC m=+309.805429945" watchObservedRunningTime="2026-02-01 06:48:17.836216816 +0000 UTC m=+328.487152831" Feb 01 06:48:17 crc kubenswrapper[4546]: I0201 06:48:17.854666 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5sd"] Feb 01 06:48:25 crc kubenswrapper[4546]: I0201 06:48:25.420567 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:48:25 crc kubenswrapper[4546]: I0201 06:48:25.420922 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:48:42 crc kubenswrapper[4546]: I0201 06:48:42.881395 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" podUID="813828d1-6b58-42d0-a3e6-b5b0c67423c7" containerName="registry" containerID="cri-o://b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2" gracePeriod=30 Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.217519 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.305678 4546 generic.go:334] "Generic (PLEG): container finished" podID="813828d1-6b58-42d0-a3e6-b5b0c67423c7" containerID="b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2" exitCode=0 Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.305719 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" event={"ID":"813828d1-6b58-42d0-a3e6-b5b0c67423c7","Type":"ContainerDied","Data":"b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2"} Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.305751 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" event={"ID":"813828d1-6b58-42d0-a3e6-b5b0c67423c7","Type":"ContainerDied","Data":"4a4a24880595dcd2c8ffd5c245e2260161e96a949299ea346ddd87381d8e4a1c"} Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.305720 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bt5sd" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.305769 4546 scope.go:117] "RemoveContainer" containerID="b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.322526 4546 scope.go:117] "RemoveContainer" containerID="b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2" Feb 01 06:48:43 crc kubenswrapper[4546]: E0201 06:48:43.322806 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2\": container with ID starting with b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2 not found: ID does not exist" containerID="b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.322833 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2"} err="failed to get container status \"b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2\": rpc error: code = NotFound desc = could not find container \"b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2\": container with ID starting with b0a44b60016f9573affbadb37413c2ae428580ae4c262e936dca825116a51cc2 not found: ID does not exist" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.379339 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhvmj\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-kube-api-access-jhvmj\") pod \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.379505 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.379539 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-tls\") pod \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.379566 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/813828d1-6b58-42d0-a3e6-b5b0c67423c7-installation-pull-secrets\") pod \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.379590 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-bound-sa-token\") pod \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.379619 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/813828d1-6b58-42d0-a3e6-b5b0c67423c7-ca-trust-extracted\") pod \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.379653 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-certificates\") pod 
\"813828d1-6b58-42d0-a3e6-b5b0c67423c7\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.379680 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-trusted-ca\") pod \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\" (UID: \"813828d1-6b58-42d0-a3e6-b5b0c67423c7\") " Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.380538 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "813828d1-6b58-42d0-a3e6-b5b0c67423c7" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.380889 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "813828d1-6b58-42d0-a3e6-b5b0c67423c7" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.386066 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "813828d1-6b58-42d0-a3e6-b5b0c67423c7" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.387182 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "813828d1-6b58-42d0-a3e6-b5b0c67423c7" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.390729 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "813828d1-6b58-42d0-a3e6-b5b0c67423c7" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.394193 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-kube-api-access-jhvmj" (OuterVolumeSpecName: "kube-api-access-jhvmj") pod "813828d1-6b58-42d0-a3e6-b5b0c67423c7" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7"). InnerVolumeSpecName "kube-api-access-jhvmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.394761 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/813828d1-6b58-42d0-a3e6-b5b0c67423c7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "813828d1-6b58-42d0-a3e6-b5b0c67423c7" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.399005 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/813828d1-6b58-42d0-a3e6-b5b0c67423c7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "813828d1-6b58-42d0-a3e6-b5b0c67423c7" (UID: "813828d1-6b58-42d0-a3e6-b5b0c67423c7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.481069 4546 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/813828d1-6b58-42d0-a3e6-b5b0c67423c7-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.481100 4546 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.481115 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/813828d1-6b58-42d0-a3e6-b5b0c67423c7-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.481127 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhvmj\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-kube-api-access-jhvmj\") on node \"crc\" DevicePath \"\"" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.481137 4546 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.481147 4546 reconciler_common.go:293] "Volume detached 
for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/813828d1-6b58-42d0-a3e6-b5b0c67423c7-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.481155 4546 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/813828d1-6b58-42d0-a3e6-b5b0c67423c7-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.636878 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5sd"] Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.643310 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bt5sd"] Feb 01 06:48:43 crc kubenswrapper[4546]: I0201 06:48:43.664892 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="813828d1-6b58-42d0-a3e6-b5b0c67423c7" path="/var/lib/kubelet/pods/813828d1-6b58-42d0-a3e6-b5b0c67423c7/volumes" Feb 01 06:48:55 crc kubenswrapper[4546]: I0201 06:48:55.420615 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:48:55 crc kubenswrapper[4546]: I0201 06:48:55.421553 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:49:25 crc kubenswrapper[4546]: I0201 06:49:25.420983 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:49:25 crc kubenswrapper[4546]: I0201 06:49:25.421978 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:49:25 crc kubenswrapper[4546]: I0201 06:49:25.422025 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:49:25 crc kubenswrapper[4546]: I0201 06:49:25.422438 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe514fb7e5a4706637156f35a07f75c3df77c458aae7b607aeb24537d931b4e3"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 06:49:25 crc kubenswrapper[4546]: I0201 06:49:25.422500 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://fe514fb7e5a4706637156f35a07f75c3df77c458aae7b607aeb24537d931b4e3" gracePeriod=600 Feb 01 06:49:26 crc kubenswrapper[4546]: I0201 06:49:26.498795 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="fe514fb7e5a4706637156f35a07f75c3df77c458aae7b607aeb24537d931b4e3" exitCode=0 Feb 01 06:49:26 crc kubenswrapper[4546]: I0201 06:49:26.498888 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"fe514fb7e5a4706637156f35a07f75c3df77c458aae7b607aeb24537d931b4e3"} Feb 01 06:49:26 crc kubenswrapper[4546]: I0201 06:49:26.499080 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"75a51418488f257f1413aee0bcf03cd98552efa50d1a91c2d8fa14ab0a5d1e3c"} Feb 01 06:49:26 crc kubenswrapper[4546]: I0201 06:49:26.499101 4546 scope.go:117] "RemoveContainer" containerID="32d85847ba44c963a75a6977bfc5b2d34a5ce7590af59b59ac03f260d4767cbf" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.633115 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh"] Feb 01 06:50:21 crc kubenswrapper[4546]: E0201 06:50:21.634714 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="813828d1-6b58-42d0-a3e6-b5b0c67423c7" containerName="registry" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.634780 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="813828d1-6b58-42d0-a3e6-b5b0c67423c7" containerName="registry" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.634958 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="813828d1-6b58-42d0-a3e6-b5b0c67423c7" containerName="registry" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.635346 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.637492 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.638164 4546 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5pwmw" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.638968 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.642246 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-6rc6w"] Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.642825 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-6rc6w" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.647438 4546 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2s954" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.647781 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh"] Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.684805 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-6rc6w"] Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.687445 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tmt79"] Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.688488 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.691537 4546 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pls9r" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.692412 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tmt79"] Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.719962 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9gs7\" (UniqueName: \"kubernetes.io/projected/c75ff0e9-5396-4b0b-b848-7b621f1d9a6d-kube-api-access-t9gs7\") pod \"cert-manager-858654f9db-6rc6w\" (UID: \"c75ff0e9-5396-4b0b-b848-7b621f1d9a6d\") " pod="cert-manager/cert-manager-858654f9db-6rc6w" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.720037 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4t6h\" (UniqueName: \"kubernetes.io/projected/c613c234-043b-4e82-8de0-17a10c5ef180-kube-api-access-g4t6h\") pod \"cert-manager-cainjector-cf98fcc89-gq9jh\" (UID: \"c613c234-043b-4e82-8de0-17a10c5ef180\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.720059 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhnk2\" (UniqueName: \"kubernetes.io/projected/511c97f6-90ea-4846-a375-4f5ab35b4d76-kube-api-access-jhnk2\") pod \"cert-manager-webhook-687f57d79b-tmt79\" (UID: \"511c97f6-90ea-4846-a375-4f5ab35b4d76\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.820991 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4t6h\" (UniqueName: 
\"kubernetes.io/projected/c613c234-043b-4e82-8de0-17a10c5ef180-kube-api-access-g4t6h\") pod \"cert-manager-cainjector-cf98fcc89-gq9jh\" (UID: \"c613c234-043b-4e82-8de0-17a10c5ef180\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.821040 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhnk2\" (UniqueName: \"kubernetes.io/projected/511c97f6-90ea-4846-a375-4f5ab35b4d76-kube-api-access-jhnk2\") pod \"cert-manager-webhook-687f57d79b-tmt79\" (UID: \"511c97f6-90ea-4846-a375-4f5ab35b4d76\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.821093 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9gs7\" (UniqueName: \"kubernetes.io/projected/c75ff0e9-5396-4b0b-b848-7b621f1d9a6d-kube-api-access-t9gs7\") pod \"cert-manager-858654f9db-6rc6w\" (UID: \"c75ff0e9-5396-4b0b-b848-7b621f1d9a6d\") " pod="cert-manager/cert-manager-858654f9db-6rc6w" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.838849 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9gs7\" (UniqueName: \"kubernetes.io/projected/c75ff0e9-5396-4b0b-b848-7b621f1d9a6d-kube-api-access-t9gs7\") pod \"cert-manager-858654f9db-6rc6w\" (UID: \"c75ff0e9-5396-4b0b-b848-7b621f1d9a6d\") " pod="cert-manager/cert-manager-858654f9db-6rc6w" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.838872 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4t6h\" (UniqueName: \"kubernetes.io/projected/c613c234-043b-4e82-8de0-17a10c5ef180-kube-api-access-g4t6h\") pod \"cert-manager-cainjector-cf98fcc89-gq9jh\" (UID: \"c613c234-043b-4e82-8de0-17a10c5ef180\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.839417 4546 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jhnk2\" (UniqueName: \"kubernetes.io/projected/511c97f6-90ea-4846-a375-4f5ab35b4d76-kube-api-access-jhnk2\") pod \"cert-manager-webhook-687f57d79b-tmt79\" (UID: \"511c97f6-90ea-4846-a375-4f5ab35b4d76\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.950651 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh" Feb 01 06:50:21 crc kubenswrapper[4546]: I0201 06:50:21.956575 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-6rc6w" Feb 01 06:50:22 crc kubenswrapper[4546]: I0201 06:50:22.001342 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" Feb 01 06:50:22 crc kubenswrapper[4546]: I0201 06:50:22.123634 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh"] Feb 01 06:50:22 crc kubenswrapper[4546]: I0201 06:50:22.134830 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 06:50:22 crc kubenswrapper[4546]: I0201 06:50:22.374561 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-6rc6w"] Feb 01 06:50:22 crc kubenswrapper[4546]: W0201 06:50:22.378163 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc75ff0e9_5396_4b0b_b848_7b621f1d9a6d.slice/crio-63fc001ec63f571ce2d29283ddd6980f178b0660de82b69763e53528b3abe1e2 WatchSource:0}: Error finding container 63fc001ec63f571ce2d29283ddd6980f178b0660de82b69763e53528b3abe1e2: Status 404 returned error can't find the container with id 63fc001ec63f571ce2d29283ddd6980f178b0660de82b69763e53528b3abe1e2 Feb 01 06:50:22 crc kubenswrapper[4546]: I0201 
06:50:22.400606 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tmt79"] Feb 01 06:50:22 crc kubenswrapper[4546]: W0201 06:50:22.403305 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod511c97f6_90ea_4846_a375_4f5ab35b4d76.slice/crio-b37bc55aeeb4632fff8ea85b6b411810363a2d8efdf1df1545ff8f7996941a7a WatchSource:0}: Error finding container b37bc55aeeb4632fff8ea85b6b411810363a2d8efdf1df1545ff8f7996941a7a: Status 404 returned error can't find the container with id b37bc55aeeb4632fff8ea85b6b411810363a2d8efdf1df1545ff8f7996941a7a Feb 01 06:50:22 crc kubenswrapper[4546]: I0201 06:50:22.745247 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" event={"ID":"511c97f6-90ea-4846-a375-4f5ab35b4d76","Type":"ContainerStarted","Data":"b37bc55aeeb4632fff8ea85b6b411810363a2d8efdf1df1545ff8f7996941a7a"} Feb 01 06:50:22 crc kubenswrapper[4546]: I0201 06:50:22.746294 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh" event={"ID":"c613c234-043b-4e82-8de0-17a10c5ef180","Type":"ContainerStarted","Data":"177cfae50cd0008d3ec71ddeb1559280f4fd9d1be0555eff65d33895d24b1805"} Feb 01 06:50:22 crc kubenswrapper[4546]: I0201 06:50:22.747332 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-6rc6w" event={"ID":"c75ff0e9-5396-4b0b-b848-7b621f1d9a6d","Type":"ContainerStarted","Data":"63fc001ec63f571ce2d29283ddd6980f178b0660de82b69763e53528b3abe1e2"} Feb 01 06:50:24 crc kubenswrapper[4546]: I0201 06:50:24.759365 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh" event={"ID":"c613c234-043b-4e82-8de0-17a10c5ef180","Type":"ContainerStarted","Data":"2f1b4267fd61f26d4328a2091dc027690d44a5a25df52c37481f48120051fd12"} Feb 01 06:50:25 crc 
kubenswrapper[4546]: I0201 06:50:25.764434 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" event={"ID":"511c97f6-90ea-4846-a375-4f5ab35b4d76","Type":"ContainerStarted","Data":"bc5c6259ec3971e6c8290545cb6370942a895df79f0ca5a5b806189e0b3e1b16"} Feb 01 06:50:25 crc kubenswrapper[4546]: I0201 06:50:25.764563 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" Feb 01 06:50:25 crc kubenswrapper[4546]: I0201 06:50:25.765806 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-6rc6w" event={"ID":"c75ff0e9-5396-4b0b-b848-7b621f1d9a6d","Type":"ContainerStarted","Data":"97020c3831091c54ae3a571a26eb599dfb4daf64d2e9471529ffa2ee85b4d537"} Feb 01 06:50:25 crc kubenswrapper[4546]: I0201 06:50:25.779526 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gq9jh" podStartSLOduration=2.922907672 podStartE2EDuration="4.779505645s" podCreationTimestamp="2026-02-01 06:50:21 +0000 UTC" firstStartedPulling="2026-02-01 06:50:22.134414085 +0000 UTC m=+452.785350101" lastFinishedPulling="2026-02-01 06:50:23.991012058 +0000 UTC m=+454.641948074" observedRunningTime="2026-02-01 06:50:24.776350453 +0000 UTC m=+455.427286479" watchObservedRunningTime="2026-02-01 06:50:25.779505645 +0000 UTC m=+456.430441651" Feb 01 06:50:25 crc kubenswrapper[4546]: I0201 06:50:25.780286 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" podStartSLOduration=1.988582354 podStartE2EDuration="4.78028237s" podCreationTimestamp="2026-02-01 06:50:21 +0000 UTC" firstStartedPulling="2026-02-01 06:50:22.405298418 +0000 UTC m=+453.056234434" lastFinishedPulling="2026-02-01 06:50:25.196998434 +0000 UTC m=+455.847934450" observedRunningTime="2026-02-01 06:50:25.776744279 +0000 UTC m=+456.427680295" 
watchObservedRunningTime="2026-02-01 06:50:25.78028237 +0000 UTC m=+456.431218375" Feb 01 06:50:25 crc kubenswrapper[4546]: I0201 06:50:25.792698 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-6rc6w" podStartSLOduration=1.9821728090000001 podStartE2EDuration="4.792683629s" podCreationTimestamp="2026-02-01 06:50:21 +0000 UTC" firstStartedPulling="2026-02-01 06:50:22.380410938 +0000 UTC m=+453.031346954" lastFinishedPulling="2026-02-01 06:50:25.190921757 +0000 UTC m=+455.841857774" observedRunningTime="2026-02-01 06:50:25.787963871 +0000 UTC m=+456.438899887" watchObservedRunningTime="2026-02-01 06:50:25.792683629 +0000 UTC m=+456.443619646" Feb 01 06:50:32 crc kubenswrapper[4546]: I0201 06:50:32.003606 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-tmt79" Feb 01 06:50:44 crc kubenswrapper[4546]: I0201 06:50:44.979453 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4klz2"] Feb 01 06:50:44 crc kubenswrapper[4546]: I0201 06:50:44.980441 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovn-controller" containerID="cri-o://ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d" gracePeriod=30 Feb 01 06:50:44 crc kubenswrapper[4546]: I0201 06:50:44.980819 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="sbdb" containerID="cri-o://233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260" gracePeriod=30 Feb 01 06:50:44 crc kubenswrapper[4546]: I0201 06:50:44.980885 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" 
podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="nbdb" containerID="cri-o://5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727" gracePeriod=30 Feb 01 06:50:44 crc kubenswrapper[4546]: I0201 06:50:44.980920 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="northd" containerID="cri-o://fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b" gracePeriod=30 Feb 01 06:50:44 crc kubenswrapper[4546]: I0201 06:50:44.980952 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6" gracePeriod=30 Feb 01 06:50:44 crc kubenswrapper[4546]: I0201 06:50:44.980990 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kube-rbac-proxy-node" containerID="cri-o://f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789" gracePeriod=30 Feb 01 06:50:44 crc kubenswrapper[4546]: I0201 06:50:44.981024 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovn-acl-logging" containerID="cri-o://a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970" gracePeriod=30 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.014760 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" 
containerID="cri-o://0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20" gracePeriod=30 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.307212 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/3.log" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.309584 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovn-acl-logging/0.log" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.310028 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovn-controller/0.log" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.310568 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.354115 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p4xgh"] Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.354594 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.354655 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.354709 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kube-rbac-proxy-ovn-metrics" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.354759 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kube-rbac-proxy-ovn-metrics" Feb 01 06:50:45 
crc kubenswrapper[4546]: E0201 06:50:45.354808 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="sbdb" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.354851 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="sbdb" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.354930 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="nbdb" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.354975 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="nbdb" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.355032 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kubecfg-setup" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355081 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kubecfg-setup" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.355134 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kube-rbac-proxy-node" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355180 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kube-rbac-proxy-node" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.355227 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355272 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.355319 
4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovn-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355364 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovn-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.355408 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="northd" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355456 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="northd" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.355506 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovn-acl-logging" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355557 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovn-acl-logging" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.355605 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355646 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355780 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355833 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="northd" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355914 4546 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="sbdb" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.355966 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.356024 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.356073 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.356116 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovn-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.356155 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="nbdb" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.356198 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kube-rbac-proxy-node" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.356242 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="kube-rbac-proxy-ovn-metrics" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.356287 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovn-acl-logging" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.356410 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc 
kubenswrapper[4546]: I0201 06:50:45.356455 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.356502 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.356545 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.356667 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerName="ovnkube-controller" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.358193 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380588 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-ovnkube-script-lib\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380624 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-etc-openvswitch\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380640 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-node-log\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380653 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-run-ovn-kubernetes\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380673 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbccj\" (UniqueName: \"kubernetes.io/projected/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-kube-api-access-wbccj\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380686 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-ovn-node-metrics-cert\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380704 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-var-lib-openvswitch\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380720 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-run-systemd\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380733 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-run-openvswitch\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380752 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-systemd-units\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380769 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-log-socket\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380788 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-kubelet\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380803 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-slash\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380817 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-cni-netd\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380834 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380878 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-ovnkube-config\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380898 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-run-ovn\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 
06:50:45.380920 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-cni-bin\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380935 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-run-netns\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.380949 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-env-overrides\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.481994 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-log-socket\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482109 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-log-socket" (OuterVolumeSpecName: "log-socket") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482117 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g65h\" (UniqueName: \"kubernetes.io/projected/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-kube-api-access-2g65h\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482157 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-kubelet\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482183 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-env-overrides\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482199 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-script-lib\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482216 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-node-log\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482233 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-config\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482248 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-systemd-units\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482267 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-ovn\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482282 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-netns\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482303 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482323 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovn-node-metrics-cert\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: 
\"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482337 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-ovn-kubernetes\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482352 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-etc-openvswitch\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482367 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-bin\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482381 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-slash\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482394 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-netd\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482410 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-systemd\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482424 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-var-lib-openvswitch\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482438 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-openvswitch\") pod \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\" (UID: \"d4014c65-cdc3-4e2d-a7c3-2ac94248d488\") " Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482494 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-run-ovn\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482519 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-cni-bin\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482538 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-run-netns\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482555 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-env-overrides\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482571 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482576 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-ovnkube-script-lib\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482612 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-etc-openvswitch\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482634 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-node-log\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482649 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-run-ovn-kubernetes\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482668 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbccj\" (UniqueName: \"kubernetes.io/projected/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-kube-api-access-wbccj\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482683 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-ovn-node-metrics-cert\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482703 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-var-lib-openvswitch\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482719 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-run-systemd\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482733 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-run-openvswitch\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482752 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-systemd-units\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482770 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-log-socket\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482786 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-kubelet\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482805 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-slash\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 
06:50:45.482819 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-cni-netd\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482838 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482877 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-ovnkube-config\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482912 4546 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-log-socket\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.482923 4546 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483034 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-run-systemd\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483072 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483142 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483192 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-etc-openvswitch\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483202 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-ovnkube-script-lib\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483226 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-node-log\") pod \"ovnkube-node-p4xgh\" (UID: 
\"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483249 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483259 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-run-ovn-kubernetes\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483274 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-slash" (OuterVolumeSpecName: "host-slash") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483503 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-ovnkube-config\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483514 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483549 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-slash\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483573 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-cni-netd\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483592 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483612 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-var-lib-openvswitch\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483632 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-systemd-units\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483645 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483667 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-log-socket\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483652 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-run-openvswitch\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483695 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-run-ovn\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483725 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483754 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-node-log" (OuterVolumeSpecName: "node-log") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483769 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483805 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-cni-bin\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483837 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-run-netns\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483926 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.483962 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.484013 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.484089 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.484125 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.484159 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.484181 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-env-overrides\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.484248 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-host-kubelet\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.489171 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-ovn-node-metrics-cert\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.489179 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod 
"d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.489516 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-kube-api-access-2g65h" (OuterVolumeSpecName: "kube-api-access-2g65h") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "kube-api-access-2g65h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.495451 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d4014c65-cdc3-4e2d-a7c3-2ac94248d488" (UID: "d4014c65-cdc3-4e2d-a7c3-2ac94248d488"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.498433 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbccj\" (UniqueName: \"kubernetes.io/projected/4539d500-e3f4-495a-8dfe-7dbfd8f4338b-kube-api-access-wbccj\") pod \"ovnkube-node-p4xgh\" (UID: \"4539d500-e3f4-495a-8dfe-7dbfd8f4338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583344 4546 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583373 4546 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583389 4546 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583398 4546 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583410 4546 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583419 4546 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583429 4546 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583440 4546 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-slash\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583448 4546 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583456 4546 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583463 4546 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583474 4546 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583482 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g65h\" (UniqueName: \"kubernetes.io/projected/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-kube-api-access-2g65h\") on node \"crc\" 
DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583490 4546 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583497 4546 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583504 4546 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-node-log\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583513 4546 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.583520 4546 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d4014c65-cdc3-4e2d-a7c3-2ac94248d488-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.669704 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:45 crc kubenswrapper[4546]: W0201 06:50:45.696387 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4539d500_e3f4_495a_8dfe_7dbfd8f4338b.slice/crio-54297c9cc92170f0b55e89415188143b6eb3eb482567c4a24f40467ced6d5008 WatchSource:0}: Error finding container 54297c9cc92170f0b55e89415188143b6eb3eb482567c4a24f40467ced6d5008: Status 404 returned error can't find the container with id 54297c9cc92170f0b55e89415188143b6eb3eb482567c4a24f40467ced6d5008 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.854566 4546 generic.go:334] "Generic (PLEG): container finished" podID="4539d500-e3f4-495a-8dfe-7dbfd8f4338b" containerID="29a68895001f45013b7284047f8e90797a414f970fbbb91f2d5c42b6c206d05e" exitCode=0 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.854637 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerDied","Data":"29a68895001f45013b7284047f8e90797a414f970fbbb91f2d5c42b6c206d05e"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.854827 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerStarted","Data":"54297c9cc92170f0b55e89415188143b6eb3eb482567c4a24f40467ced6d5008"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.856629 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nwmnb_95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16/kube-multus/1.log" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.857195 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nwmnb_95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16/kube-multus/0.log" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.857293 
4546 generic.go:334] "Generic (PLEG): container finished" podID="95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16" containerID="6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b" exitCode=2 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.857392 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwmnb" event={"ID":"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16","Type":"ContainerDied","Data":"6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.857471 4546 scope.go:117] "RemoveContainer" containerID="bd4e1ff59a1a78fca318ed0ebf19a3c7b7e19f7a97b0a3a9b6ab46fbd3a94271" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.857966 4546 scope.go:117] "RemoveContainer" containerID="6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b" Feb 01 06:50:45 crc kubenswrapper[4546]: E0201 06:50:45.858245 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-nwmnb_openshift-multus(95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16)\"" pod="openshift-multus/multus-nwmnb" podUID="95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.863452 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovnkube-controller/3.log" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.865606 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovn-acl-logging/0.log" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866071 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4klz2_d4014c65-cdc3-4e2d-a7c3-2ac94248d488/ovn-controller/0.log" Feb 01 06:50:45 crc kubenswrapper[4546]: 
I0201 06:50:45.866440 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20" exitCode=0 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866475 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260" exitCode=0 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866486 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727" exitCode=0 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866496 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b" exitCode=0 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866505 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6" exitCode=0 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866513 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789" exitCode=0 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866514 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866523 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970" exitCode=143 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866534 4546 generic.go:334] "Generic (PLEG): container finished" podID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" containerID="ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d" exitCode=143 Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866570 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866617 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866629 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866640 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866651 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866662 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866674 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866689 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866694 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866700 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866705 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866710 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866716 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866721 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866728 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866734 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866744 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866754 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866762 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866768 4546 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866776 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866781 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866786 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866792 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866797 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866804 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866809 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866817 4546 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866827 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866834 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866840 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866846 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866852 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866882 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866891 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} Feb 01 
06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866897 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866905 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866911 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866919 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4klz2" event={"ID":"d4014c65-cdc3-4e2d-a7c3-2ac94248d488","Type":"ContainerDied","Data":"b971ac7f229b44da93e305c9ae68ebcfe0d1f79ff970693247e95d72aef3bbda"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866928 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866934 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866940 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866946 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866951 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866956 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866962 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866966 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866971 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.866975 4546 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5"} Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.886805 4546 scope.go:117] "RemoveContainer" containerID="0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.916321 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 
06:50:45.917944 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4klz2"] Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.920392 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4klz2"] Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.939775 4546 scope.go:117] "RemoveContainer" containerID="233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.951328 4546 scope.go:117] "RemoveContainer" containerID="5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.968074 4546 scope.go:117] "RemoveContainer" containerID="fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b" Feb 01 06:50:45 crc kubenswrapper[4546]: I0201 06:50:45.984381 4546 scope.go:117] "RemoveContainer" containerID="378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.002141 4546 scope.go:117] "RemoveContainer" containerID="f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.013943 4546 scope.go:117] "RemoveContainer" containerID="a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.035532 4546 scope.go:117] "RemoveContainer" containerID="ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.048967 4546 scope.go:117] "RemoveContainer" containerID="634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.070689 4546 scope.go:117] "RemoveContainer" containerID="0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.072237 4546 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20\": container with ID starting with 0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20 not found: ID does not exist" containerID="0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.072292 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} err="failed to get container status \"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20\": rpc error: code = NotFound desc = could not find container \"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20\": container with ID starting with 0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.072322 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.073357 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\": container with ID starting with f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2 not found: ID does not exist" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.073402 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} err="failed to get container status \"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\": rpc error: code = NotFound desc = could not find 
container \"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\": container with ID starting with f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.073424 4546 scope.go:117] "RemoveContainer" containerID="233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.074545 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\": container with ID starting with 233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260 not found: ID does not exist" containerID="233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.074642 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} err="failed to get container status \"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\": rpc error: code = NotFound desc = could not find container \"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\": container with ID starting with 233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.074746 4546 scope.go:117] "RemoveContainer" containerID="5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.075294 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\": container with ID starting with 5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727 not found: ID does 
not exist" containerID="5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.075357 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} err="failed to get container status \"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\": rpc error: code = NotFound desc = could not find container \"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\": container with ID starting with 5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.075396 4546 scope.go:117] "RemoveContainer" containerID="fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.076687 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\": container with ID starting with fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b not found: ID does not exist" containerID="fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.076723 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} err="failed to get container status \"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\": rpc error: code = NotFound desc = could not find container \"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\": container with ID starting with fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.076747 4546 
scope.go:117] "RemoveContainer" containerID="378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.077976 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\": container with ID starting with 378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6 not found: ID does not exist" containerID="378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.078027 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} err="failed to get container status \"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\": rpc error: code = NotFound desc = could not find container \"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\": container with ID starting with 378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.078049 4546 scope.go:117] "RemoveContainer" containerID="f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.078553 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\": container with ID starting with f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789 not found: ID does not exist" containerID="f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.078590 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} err="failed to get container status \"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\": rpc error: code = NotFound desc = could not find container \"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\": container with ID starting with f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.078608 4546 scope.go:117] "RemoveContainer" containerID="a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.078928 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\": container with ID starting with a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970 not found: ID does not exist" containerID="a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.078961 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} err="failed to get container status \"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\": rpc error: code = NotFound desc = could not find container \"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\": container with ID starting with a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.078991 4546 scope.go:117] "RemoveContainer" containerID="ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.079254 4546 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\": container with ID starting with ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d not found: ID does not exist" containerID="ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.079282 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} err="failed to get container status \"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\": rpc error: code = NotFound desc = could not find container \"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\": container with ID starting with ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.079297 4546 scope.go:117] "RemoveContainer" containerID="634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5" Feb 01 06:50:46 crc kubenswrapper[4546]: E0201 06:50:46.079562 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\": container with ID starting with 634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5 not found: ID does not exist" containerID="634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.079590 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5"} err="failed to get container status \"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\": rpc error: code = NotFound desc = could not find container 
\"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\": container with ID starting with 634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.079607 4546 scope.go:117] "RemoveContainer" containerID="0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.079813 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} err="failed to get container status \"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20\": rpc error: code = NotFound desc = could not find container \"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20\": container with ID starting with 0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.079834 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.080105 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} err="failed to get container status \"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\": rpc error: code = NotFound desc = could not find container \"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\": container with ID starting with f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.080132 4546 scope.go:117] "RemoveContainer" containerID="233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.080369 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} err="failed to get container status \"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\": rpc error: code = NotFound desc = could not find container \"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\": container with ID starting with 233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.080390 4546 scope.go:117] "RemoveContainer" containerID="5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.080584 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} err="failed to get container status \"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\": rpc error: code = NotFound desc = could not find container \"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\": container with ID starting with 5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.080607 4546 scope.go:117] "RemoveContainer" containerID="fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.080771 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} err="failed to get container status \"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\": rpc error: code = NotFound desc = could not find container \"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\": container with ID starting with 
fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.080791 4546 scope.go:117] "RemoveContainer" containerID="378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.080995 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} err="failed to get container status \"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\": rpc error: code = NotFound desc = could not find container \"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\": container with ID starting with 378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.081015 4546 scope.go:117] "RemoveContainer" containerID="f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.081230 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} err="failed to get container status \"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\": rpc error: code = NotFound desc = could not find container \"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\": container with ID starting with f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.081251 4546 scope.go:117] "RemoveContainer" containerID="a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.081438 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} err="failed to get container status \"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\": rpc error: code = NotFound desc = could not find container \"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\": container with ID starting with a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.081459 4546 scope.go:117] "RemoveContainer" containerID="ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.081645 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} err="failed to get container status \"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\": rpc error: code = NotFound desc = could not find container \"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\": container with ID starting with ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.081665 4546 scope.go:117] "RemoveContainer" containerID="634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.081848 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5"} err="failed to get container status \"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\": rpc error: code = NotFound desc = could not find container \"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\": container with ID starting with 634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5 not found: ID does not 
exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.081885 4546 scope.go:117] "RemoveContainer" containerID="0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.082077 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} err="failed to get container status \"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20\": rpc error: code = NotFound desc = could not find container \"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20\": container with ID starting with 0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.082096 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.082390 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} err="failed to get container status \"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\": rpc error: code = NotFound desc = could not find container \"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\": container with ID starting with f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.082411 4546 scope.go:117] "RemoveContainer" containerID="233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.082640 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} err="failed to get container status 
\"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\": rpc error: code = NotFound desc = could not find container \"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\": container with ID starting with 233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.082660 4546 scope.go:117] "RemoveContainer" containerID="5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.082866 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} err="failed to get container status \"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\": rpc error: code = NotFound desc = could not find container \"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\": container with ID starting with 5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.082886 4546 scope.go:117] "RemoveContainer" containerID="fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.083123 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} err="failed to get container status \"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\": rpc error: code = NotFound desc = could not find container \"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\": container with ID starting with fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.083146 4546 scope.go:117] "RemoveContainer" 
containerID="378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.083362 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} err="failed to get container status \"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\": rpc error: code = NotFound desc = could not find container \"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\": container with ID starting with 378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.083396 4546 scope.go:117] "RemoveContainer" containerID="f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.083669 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} err="failed to get container status \"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\": rpc error: code = NotFound desc = could not find container \"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\": container with ID starting with f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.083690 4546 scope.go:117] "RemoveContainer" containerID="a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.083869 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} err="failed to get container status \"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\": rpc error: code = NotFound desc = could 
not find container \"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\": container with ID starting with a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.083889 4546 scope.go:117] "RemoveContainer" containerID="ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.084093 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} err="failed to get container status \"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\": rpc error: code = NotFound desc = could not find container \"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\": container with ID starting with ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.084113 4546 scope.go:117] "RemoveContainer" containerID="634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.084298 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5"} err="failed to get container status \"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\": rpc error: code = NotFound desc = could not find container \"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\": container with ID starting with 634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.084318 4546 scope.go:117] "RemoveContainer" containerID="0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 
06:50:46.084647 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20"} err="failed to get container status \"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20\": rpc error: code = NotFound desc = could not find container \"0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20\": container with ID starting with 0f7c90c0df6597d0d4e7e55cfe4a537f3b9af7a68a0021026aec7430c61e0e20 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.084667 4546 scope.go:117] "RemoveContainer" containerID="f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.084851 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2"} err="failed to get container status \"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\": rpc error: code = NotFound desc = could not find container \"f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2\": container with ID starting with f5a6e7bc07b2f8545ee7f01ff273ccc0f34aa1004dc2858d287fdeaff24ac1e2 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.084888 4546 scope.go:117] "RemoveContainer" containerID="233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.085183 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260"} err="failed to get container status \"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\": rpc error: code = NotFound desc = could not find container \"233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260\": container with ID starting with 
233699692f972563a46171abf6856f9f66d3c62a8a7388c9feaae0a49563e260 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.085203 4546 scope.go:117] "RemoveContainer" containerID="5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.085420 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727"} err="failed to get container status \"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\": rpc error: code = NotFound desc = could not find container \"5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727\": container with ID starting with 5f1ede2cff7d09c412f7bd6b1275b75a6a946d85aa1186fbf7a8e16ceb3e3727 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.085439 4546 scope.go:117] "RemoveContainer" containerID="fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.085619 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b"} err="failed to get container status \"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\": rpc error: code = NotFound desc = could not find container \"fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b\": container with ID starting with fb802901252d11796c8eff9a54f206f64d1e762858f697aae97b1f7db057a58b not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.085639 4546 scope.go:117] "RemoveContainer" containerID="378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.085892 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6"} err="failed to get container status \"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\": rpc error: code = NotFound desc = could not find container \"378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6\": container with ID starting with 378eeca042f111f5564ca22b2c3cfdfc149ac43144aeb356049314649211d4c6 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.085913 4546 scope.go:117] "RemoveContainer" containerID="f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.086306 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789"} err="failed to get container status \"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\": rpc error: code = NotFound desc = could not find container \"f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789\": container with ID starting with f97979dbdf97130703263a2ab6f3b9afba9eea8a566fb28972c710ba5d407789 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.086327 4546 scope.go:117] "RemoveContainer" containerID="a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.086642 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970"} err="failed to get container status \"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\": rpc error: code = NotFound desc = could not find container \"a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970\": container with ID starting with a96c0ee1667691a8f101d6c7314c5ea1618c2e166492b7805109a587462c7970 not found: ID does not 
exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.086663 4546 scope.go:117] "RemoveContainer" containerID="ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.086884 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d"} err="failed to get container status \"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\": rpc error: code = NotFound desc = could not find container \"ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d\": container with ID starting with ea50b6d0f549e05c42f82d9a8a4182969b34b37bebdc2440c6742abf2a253b1d not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.086904 4546 scope.go:117] "RemoveContainer" containerID="634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.087815 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5"} err="failed to get container status \"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\": rpc error: code = NotFound desc = could not find container \"634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5\": container with ID starting with 634e27a69e7c44878a19a02a67b98417dc7c363500bcae8ccd81196cd5189fc5 not found: ID does not exist" Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.874765 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerStarted","Data":"3a2e76bf00a62e90d695352bf0446679c1d6867357aad113b05b6de55a2c638b"} Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.875093 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerStarted","Data":"2904eeaa9ad134545be79edb8b98156bcf7bad4277f027486d352e065f9e38d6"} Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.875104 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerStarted","Data":"bdb6a2b695f98be5685e66397f5d7c54a927bfb347ed4c26abd64132bf355e24"} Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.875114 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerStarted","Data":"fc652fc07149fb0deafdd513075732c654f723d2e375100628ba6ec4be2cd53f"} Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.875122 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerStarted","Data":"83442a6e614a5309751a1341b263878e2a2bd5faafeb6ff58081f84ecd9e664a"} Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.875130 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerStarted","Data":"b1e9cddd0b18329a2340a29979a955b4db0d88f92e6c1ee3d6f4979ee7bcf813"} Feb 01 06:50:46 crc kubenswrapper[4546]: I0201 06:50:46.876570 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nwmnb_95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16/kube-multus/1.log" Feb 01 06:50:47 crc kubenswrapper[4546]: I0201 06:50:47.661688 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4014c65-cdc3-4e2d-a7c3-2ac94248d488" path="/var/lib/kubelet/pods/d4014c65-cdc3-4e2d-a7c3-2ac94248d488/volumes" Feb 01 06:50:48 crc kubenswrapper[4546]: I0201 
06:50:48.897025 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerStarted","Data":"2a2c0f39d4948be461275ce97419d97a29b855d992b57aad29338733d3d65684"} Feb 01 06:50:50 crc kubenswrapper[4546]: I0201 06:50:50.921508 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" event={"ID":"4539d500-e3f4-495a-8dfe-7dbfd8f4338b","Type":"ContainerStarted","Data":"6dbd40037bd196a557ee64a6a50195f1685b633b94f2bd8e6e095bcb7d06bade"} Feb 01 06:50:50 crc kubenswrapper[4546]: I0201 06:50:50.922188 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:50 crc kubenswrapper[4546]: I0201 06:50:50.922202 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:50 crc kubenswrapper[4546]: I0201 06:50:50.922212 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:50 crc kubenswrapper[4546]: I0201 06:50:50.950277 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:50 crc kubenswrapper[4546]: I0201 06:50:50.950745 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:50:50 crc kubenswrapper[4546]: I0201 06:50:50.964289 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" podStartSLOduration=5.964270951 podStartE2EDuration="5.964270951s" podCreationTimestamp="2026-02-01 06:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:50:50.961728257 +0000 UTC 
m=+481.612664273" watchObservedRunningTime="2026-02-01 06:50:50.964270951 +0000 UTC m=+481.615206967" Feb 01 06:50:58 crc kubenswrapper[4546]: I0201 06:50:58.654403 4546 scope.go:117] "RemoveContainer" containerID="6f28fb3805758653fa09744e86e247e8c38933d24f9033588fc9e61610246d9b" Feb 01 06:50:58 crc kubenswrapper[4546]: I0201 06:50:58.958288 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nwmnb_95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16/kube-multus/1.log" Feb 01 06:50:58 crc kubenswrapper[4546]: I0201 06:50:58.958596 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwmnb" event={"ID":"95ec9f1f-bcd2-46f1-9c28-b5abf6d67a16","Type":"ContainerStarted","Data":"c2eca83567d9fdbc31a2b436be877266e2986bc6e772eab7ff5c2c440d4293dd"} Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.475123 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx"] Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.476143 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.478743 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.489261 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx"] Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.595359 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krlg7\" (UniqueName: \"kubernetes.io/projected/00723a4e-bb45-4076-92a8-7d7be40583e8-kube-api-access-krlg7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.595442 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.595496 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: 
I0201 06:51:02.696339 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krlg7\" (UniqueName: \"kubernetes.io/projected/00723a4e-bb45-4076-92a8-7d7be40583e8-kube-api-access-krlg7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.696410 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.696457 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.696994 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.697231 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.716055 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krlg7\" (UniqueName: \"kubernetes.io/projected/00723a4e-bb45-4076-92a8-7d7be40583e8-kube-api-access-krlg7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:02 crc kubenswrapper[4546]: I0201 06:51:02.797667 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:03 crc kubenswrapper[4546]: I0201 06:51:03.164729 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx"] Feb 01 06:51:03 crc kubenswrapper[4546]: W0201 06:51:03.170608 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00723a4e_bb45_4076_92a8_7d7be40583e8.slice/crio-228399bb86bc732cdf87eb6c9db82f2a0c8bc6516563292472a18ab0db5f9028 WatchSource:0}: Error finding container 228399bb86bc732cdf87eb6c9db82f2a0c8bc6516563292472a18ab0db5f9028: Status 404 returned error can't find the container with id 228399bb86bc732cdf87eb6c9db82f2a0c8bc6516563292472a18ab0db5f9028 Feb 01 06:51:03 crc kubenswrapper[4546]: I0201 06:51:03.989090 4546 generic.go:334] "Generic (PLEG): container finished" podID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerID="82c5e93f64a352fda3c739e08fba362bf91cc632df61b8c1546de0dc5b0cddd7" 
exitCode=0 Feb 01 06:51:03 crc kubenswrapper[4546]: I0201 06:51:03.989151 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" event={"ID":"00723a4e-bb45-4076-92a8-7d7be40583e8","Type":"ContainerDied","Data":"82c5e93f64a352fda3c739e08fba362bf91cc632df61b8c1546de0dc5b0cddd7"} Feb 01 06:51:03 crc kubenswrapper[4546]: I0201 06:51:03.989181 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" event={"ID":"00723a4e-bb45-4076-92a8-7d7be40583e8","Type":"ContainerStarted","Data":"228399bb86bc732cdf87eb6c9db82f2a0c8bc6516563292472a18ab0db5f9028"} Feb 01 06:51:06 crc kubenswrapper[4546]: I0201 06:51:06.011353 4546 generic.go:334] "Generic (PLEG): container finished" podID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerID="bda4602a68885181e92d9130dd12fbf671c6fad7c2fc3dc3786e90f4abd8968a" exitCode=0 Feb 01 06:51:06 crc kubenswrapper[4546]: I0201 06:51:06.011505 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" event={"ID":"00723a4e-bb45-4076-92a8-7d7be40583e8","Type":"ContainerDied","Data":"bda4602a68885181e92d9130dd12fbf671c6fad7c2fc3dc3786e90f4abd8968a"} Feb 01 06:51:07 crc kubenswrapper[4546]: I0201 06:51:07.021954 4546 generic.go:334] "Generic (PLEG): container finished" podID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerID="221d4457a4f9a4a77ce3952d2b03a7dba792f0434037a85a8c96d9ec6735efd6" exitCode=0 Feb 01 06:51:07 crc kubenswrapper[4546]: I0201 06:51:07.022032 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" event={"ID":"00723a4e-bb45-4076-92a8-7d7be40583e8","Type":"ContainerDied","Data":"221d4457a4f9a4a77ce3952d2b03a7dba792f0434037a85a8c96d9ec6735efd6"} Feb 01 06:51:08 crc 
kubenswrapper[4546]: I0201 06:51:08.198910 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:08 crc kubenswrapper[4546]: I0201 06:51:08.261415 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krlg7\" (UniqueName: \"kubernetes.io/projected/00723a4e-bb45-4076-92a8-7d7be40583e8-kube-api-access-krlg7\") pod \"00723a4e-bb45-4076-92a8-7d7be40583e8\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " Feb 01 06:51:08 crc kubenswrapper[4546]: I0201 06:51:08.261487 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-util\") pod \"00723a4e-bb45-4076-92a8-7d7be40583e8\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " Feb 01 06:51:08 crc kubenswrapper[4546]: I0201 06:51:08.261527 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-bundle\") pod \"00723a4e-bb45-4076-92a8-7d7be40583e8\" (UID: \"00723a4e-bb45-4076-92a8-7d7be40583e8\") " Feb 01 06:51:08 crc kubenswrapper[4546]: I0201 06:51:08.262001 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-bundle" (OuterVolumeSpecName: "bundle") pod "00723a4e-bb45-4076-92a8-7d7be40583e8" (UID: "00723a4e-bb45-4076-92a8-7d7be40583e8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:08 crc kubenswrapper[4546]: I0201 06:51:08.266168 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00723a4e-bb45-4076-92a8-7d7be40583e8-kube-api-access-krlg7" (OuterVolumeSpecName: "kube-api-access-krlg7") pod "00723a4e-bb45-4076-92a8-7d7be40583e8" (UID: "00723a4e-bb45-4076-92a8-7d7be40583e8"). InnerVolumeSpecName "kube-api-access-krlg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:51:08 crc kubenswrapper[4546]: I0201 06:51:08.272192 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-util" (OuterVolumeSpecName: "util") pod "00723a4e-bb45-4076-92a8-7d7be40583e8" (UID: "00723a4e-bb45-4076-92a8-7d7be40583e8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:08 crc kubenswrapper[4546]: I0201 06:51:08.363246 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krlg7\" (UniqueName: \"kubernetes.io/projected/00723a4e-bb45-4076-92a8-7d7be40583e8-kube-api-access-krlg7\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:08 crc kubenswrapper[4546]: I0201 06:51:08.363287 4546 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-util\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:08 crc kubenswrapper[4546]: I0201 06:51:08.363301 4546 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00723a4e-bb45-4076-92a8-7d7be40583e8-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:09 crc kubenswrapper[4546]: I0201 06:51:09.034650 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" 
event={"ID":"00723a4e-bb45-4076-92a8-7d7be40583e8","Type":"ContainerDied","Data":"228399bb86bc732cdf87eb6c9db82f2a0c8bc6516563292472a18ab0db5f9028"} Feb 01 06:51:09 crc kubenswrapper[4546]: I0201 06:51:09.034713 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228399bb86bc732cdf87eb6c9db82f2a0c8bc6516563292472a18ab0db5f9028" Feb 01 06:51:09 crc kubenswrapper[4546]: I0201 06:51:09.035029 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7136klpx" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.378314 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-5d9cr"] Feb 01 06:51:10 crc kubenswrapper[4546]: E0201 06:51:10.378874 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerName="extract" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.378888 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerName="extract" Feb 01 06:51:10 crc kubenswrapper[4546]: E0201 06:51:10.378896 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerName="pull" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.378903 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerName="pull" Feb 01 06:51:10 crc kubenswrapper[4546]: E0201 06:51:10.378917 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerName="util" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.378923 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerName="util" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.379027 4546 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="00723a4e-bb45-4076-92a8-7d7be40583e8" containerName="extract" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.379456 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-5d9cr" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.380916 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bndqb" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.382673 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.385348 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqx4\" (UniqueName: \"kubernetes.io/projected/1096de9c-ea0e-434d-b9a8-6927c18b8bab-kube-api-access-vcqx4\") pod \"nmstate-operator-646758c888-5d9cr\" (UID: \"1096de9c-ea0e-434d-b9a8-6927c18b8bab\") " pod="openshift-nmstate/nmstate-operator-646758c888-5d9cr" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.385700 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.394564 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-5d9cr"] Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.487203 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcqx4\" (UniqueName: \"kubernetes.io/projected/1096de9c-ea0e-434d-b9a8-6927c18b8bab-kube-api-access-vcqx4\") pod \"nmstate-operator-646758c888-5d9cr\" (UID: \"1096de9c-ea0e-434d-b9a8-6927c18b8bab\") " pod="openshift-nmstate/nmstate-operator-646758c888-5d9cr" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.513715 4546 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vcqx4\" (UniqueName: \"kubernetes.io/projected/1096de9c-ea0e-434d-b9a8-6927c18b8bab-kube-api-access-vcqx4\") pod \"nmstate-operator-646758c888-5d9cr\" (UID: \"1096de9c-ea0e-434d-b9a8-6927c18b8bab\") " pod="openshift-nmstate/nmstate-operator-646758c888-5d9cr" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.690332 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-5d9cr" Feb 01 06:51:10 crc kubenswrapper[4546]: I0201 06:51:10.880301 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-5d9cr"] Feb 01 06:51:11 crc kubenswrapper[4546]: I0201 06:51:11.046176 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-5d9cr" event={"ID":"1096de9c-ea0e-434d-b9a8-6927c18b8bab","Type":"ContainerStarted","Data":"f4ebf016745d8008e5269771a1b8f0a0b6f5ffb1d5219881069aa2112264b120"} Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.059474 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-5d9cr" event={"ID":"1096de9c-ea0e-434d-b9a8-6927c18b8bab","Type":"ContainerStarted","Data":"7c6ce1e519024c01ddfe4cf70a15f19664684521198ef3e06d44cd0c344d1aec"} Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.076805 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-5d9cr" podStartSLOduration=1.13581504 podStartE2EDuration="3.076783223s" podCreationTimestamp="2026-02-01 06:51:10 +0000 UTC" firstStartedPulling="2026-02-01 06:51:10.893486893 +0000 UTC m=+501.544422909" lastFinishedPulling="2026-02-01 06:51:12.834455076 +0000 UTC m=+503.485391092" observedRunningTime="2026-02-01 06:51:13.074210233 +0000 UTC m=+503.725146259" watchObservedRunningTime="2026-02-01 06:51:13.076783223 +0000 UTC m=+503.727719238" Feb 01 06:51:13 crc 
kubenswrapper[4546]: I0201 06:51:13.878350 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-245tj"] Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.879522 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-245tj" Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.881281 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-gt7bh" Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.889436 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp"] Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.890228 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.900458 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-245tj"] Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.901622 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.916736 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp"] Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.917596 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-wst5k"] Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.918319 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.998668 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp"] Feb 01 06:51:13 crc kubenswrapper[4546]: I0201 06:51:13.999399 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.001262 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.003205 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xrf9z" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.003408 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.039387 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp"] Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.042312 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eeeb655a-c1d4-4d94-95cb-252208486e23-ovs-socket\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.042361 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6afb50b9-8605-41e4-a7b2-9511c908663e-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-ftxkp\" (UID: \"6afb50b9-8605-41e4-a7b2-9511c908663e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:14 crc 
kubenswrapper[4546]: I0201 06:51:14.042393 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ds4p\" (UniqueName: \"kubernetes.io/projected/eeeb655a-c1d4-4d94-95cb-252208486e23-kube-api-access-4ds4p\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.042416 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t85f7\" (UniqueName: \"kubernetes.io/projected/6afb50b9-8605-41e4-a7b2-9511c908663e-kube-api-access-t85f7\") pod \"nmstate-webhook-8474b5b9d8-ftxkp\" (UID: \"6afb50b9-8605-41e4-a7b2-9511c908663e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.042446 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eeeb655a-c1d4-4d94-95cb-252208486e23-nmstate-lock\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.042480 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8gp6\" (UniqueName: \"kubernetes.io/projected/916742b1-ba9e-4c96-9b2a-90a93854b8a6-kube-api-access-b8gp6\") pod \"nmstate-metrics-54757c584b-245tj\" (UID: \"916742b1-ba9e-4c96-9b2a-90a93854b8a6\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-245tj" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.042516 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eeeb655a-c1d4-4d94-95cb-252208486e23-dbus-socket\") pod \"nmstate-handler-wst5k\" (UID: 
\"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144413 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eeeb655a-c1d4-4d94-95cb-252208486e23-nmstate-lock\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144465 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8gp6\" (UniqueName: \"kubernetes.io/projected/916742b1-ba9e-4c96-9b2a-90a93854b8a6-kube-api-access-b8gp6\") pod \"nmstate-metrics-54757c584b-245tj\" (UID: \"916742b1-ba9e-4c96-9b2a-90a93854b8a6\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-245tj" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144498 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3950d2e9-eda7-4334-a1e1-1513629c662c-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-kr6pp\" (UID: \"3950d2e9-eda7-4334-a1e1-1513629c662c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144526 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eeeb655a-c1d4-4d94-95cb-252208486e23-dbus-socket\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144565 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3950d2e9-eda7-4334-a1e1-1513629c662c-plugin-serving-cert\") pod 
\"nmstate-console-plugin-7754f76f8b-kr6pp\" (UID: \"3950d2e9-eda7-4334-a1e1-1513629c662c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144593 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eeeb655a-c1d4-4d94-95cb-252208486e23-ovs-socket\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144622 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6afb50b9-8605-41e4-a7b2-9511c908663e-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-ftxkp\" (UID: \"6afb50b9-8605-41e4-a7b2-9511c908663e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144650 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ds4p\" (UniqueName: \"kubernetes.io/projected/eeeb655a-c1d4-4d94-95cb-252208486e23-kube-api-access-4ds4p\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144670 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t85f7\" (UniqueName: \"kubernetes.io/projected/6afb50b9-8605-41e4-a7b2-9511c908663e-kube-api-access-t85f7\") pod \"nmstate-webhook-8474b5b9d8-ftxkp\" (UID: \"6afb50b9-8605-41e4-a7b2-9511c908663e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.144694 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rsb\" (UniqueName: 
\"kubernetes.io/projected/3950d2e9-eda7-4334-a1e1-1513629c662c-kube-api-access-b5rsb\") pod \"nmstate-console-plugin-7754f76f8b-kr6pp\" (UID: \"3950d2e9-eda7-4334-a1e1-1513629c662c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.145112 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eeeb655a-c1d4-4d94-95cb-252208486e23-nmstate-lock\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.145507 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eeeb655a-c1d4-4d94-95cb-252208486e23-ovs-socket\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.145810 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eeeb655a-c1d4-4d94-95cb-252208486e23-dbus-socket\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.161490 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t85f7\" (UniqueName: \"kubernetes.io/projected/6afb50b9-8605-41e4-a7b2-9511c908663e-kube-api-access-t85f7\") pod \"nmstate-webhook-8474b5b9d8-ftxkp\" (UID: \"6afb50b9-8605-41e4-a7b2-9511c908663e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.165929 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/6afb50b9-8605-41e4-a7b2-9511c908663e-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-ftxkp\" (UID: \"6afb50b9-8605-41e4-a7b2-9511c908663e\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.171680 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8gp6\" (UniqueName: \"kubernetes.io/projected/916742b1-ba9e-4c96-9b2a-90a93854b8a6-kube-api-access-b8gp6\") pod \"nmstate-metrics-54757c584b-245tj\" (UID: \"916742b1-ba9e-4c96-9b2a-90a93854b8a6\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-245tj" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.191171 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ds4p\" (UniqueName: \"kubernetes.io/projected/eeeb655a-c1d4-4d94-95cb-252208486e23-kube-api-access-4ds4p\") pod \"nmstate-handler-wst5k\" (UID: \"eeeb655a-c1d4-4d94-95cb-252208486e23\") " pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.198443 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-245tj" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.206511 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.233223 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f68d6bfd5-njtfx"] Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.233600 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.234239 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246224 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5rsb\" (UniqueName: \"kubernetes.io/projected/3950d2e9-eda7-4334-a1e1-1513629c662c-kube-api-access-b5rsb\") pod \"nmstate-console-plugin-7754f76f8b-kr6pp\" (UID: \"3950d2e9-eda7-4334-a1e1-1513629c662c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246270 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-trusted-ca-bundle\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246293 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-service-ca\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246313 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-console-oauth-config\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246330 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3950d2e9-eda7-4334-a1e1-1513629c662c-nginx-conf\") pod 
\"nmstate-console-plugin-7754f76f8b-kr6pp\" (UID: \"3950d2e9-eda7-4334-a1e1-1513629c662c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246350 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-oauth-serving-cert\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246368 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-console-config\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246386 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-console-serving-cert\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246407 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3950d2e9-eda7-4334-a1e1-1513629c662c-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-kr6pp\" (UID: \"3950d2e9-eda7-4334-a1e1-1513629c662c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.246433 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vrssn\" (UniqueName: \"kubernetes.io/projected/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-kube-api-access-vrssn\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.249930 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3950d2e9-eda7-4334-a1e1-1513629c662c-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-kr6pp\" (UID: \"3950d2e9-eda7-4334-a1e1-1513629c662c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.254754 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3950d2e9-eda7-4334-a1e1-1513629c662c-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-kr6pp\" (UID: \"3950d2e9-eda7-4334-a1e1-1513629c662c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.255946 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f68d6bfd5-njtfx"] Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.272833 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5rsb\" (UniqueName: \"kubernetes.io/projected/3950d2e9-eda7-4334-a1e1-1513629c662c-kube-api-access-b5rsb\") pod \"nmstate-console-plugin-7754f76f8b-kr6pp\" (UID: \"3950d2e9-eda7-4334-a1e1-1513629c662c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: W0201 06:51:14.277917 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeeb655a_c1d4_4d94_95cb_252208486e23.slice/crio-fd712a500a521c275371c093a4d9d6b2b004676ba0fe7c3ac4fa57fca6023fb5 WatchSource:0}: 
Error finding container fd712a500a521c275371c093a4d9d6b2b004676ba0fe7c3ac4fa57fca6023fb5: Status 404 returned error can't find the container with id fd712a500a521c275371c093a4d9d6b2b004676ba0fe7c3ac4fa57fca6023fb5 Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.313712 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.348674 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-service-ca\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.348869 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-console-oauth-config\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.348898 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-oauth-serving-cert\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.348918 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-console-config\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc 
kubenswrapper[4546]: I0201 06:51:14.348938 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-console-serving-cert\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.348968 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrssn\" (UniqueName: \"kubernetes.io/projected/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-kube-api-access-vrssn\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.349015 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-trusted-ca-bundle\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.349905 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-trusted-ca-bundle\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.350720 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-service-ca\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.353583 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-console-oauth-config\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.354140 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-console-config\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.354684 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-oauth-serving-cert\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.355605 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-console-serving-cert\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.372814 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrssn\" (UniqueName: \"kubernetes.io/projected/1c14859d-7a46-4a3c-9ae2-bcdca6a804b6-kube-api-access-vrssn\") pod \"console-f68d6bfd5-njtfx\" (UID: \"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6\") " pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.443144 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-nmstate/nmstate-metrics-54757c584b-245tj"] Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.529358 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp"] Feb 01 06:51:14 crc kubenswrapper[4546]: W0201 06:51:14.532979 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3950d2e9_eda7_4334_a1e1_1513629c662c.slice/crio-854d30c60db26544f74eae0298c89fbcffea97f743b3dd739241e8f595ebc583 WatchSource:0}: Error finding container 854d30c60db26544f74eae0298c89fbcffea97f743b3dd739241e8f595ebc583: Status 404 returned error can't find the container with id 854d30c60db26544f74eae0298c89fbcffea97f743b3dd739241e8f595ebc583 Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.552124 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.713625 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f68d6bfd5-njtfx"] Feb 01 06:51:14 crc kubenswrapper[4546]: I0201 06:51:14.718647 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp"] Feb 01 06:51:14 crc kubenswrapper[4546]: W0201 06:51:14.720217 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c14859d_7a46_4a3c_9ae2_bcdca6a804b6.slice/crio-ef1fcc22d493ab9d000f047537352014b8cf47c836f593ef22921db94fe5d977 WatchSource:0}: Error finding container ef1fcc22d493ab9d000f047537352014b8cf47c836f593ef22921db94fe5d977: Status 404 returned error can't find the container with id ef1fcc22d493ab9d000f047537352014b8cf47c836f593ef22921db94fe5d977 Feb 01 06:51:14 crc kubenswrapper[4546]: W0201 06:51:14.722089 4546 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6afb50b9_8605_41e4_a7b2_9511c908663e.slice/crio-8a990735cb2777cab828442bdcc95396683207722a805b2b9b4832c0430280d3 WatchSource:0}: Error finding container 8a990735cb2777cab828442bdcc95396683207722a805b2b9b4832c0430280d3: Status 404 returned error can't find the container with id 8a990735cb2777cab828442bdcc95396683207722a805b2b9b4832c0430280d3 Feb 01 06:51:15 crc kubenswrapper[4546]: I0201 06:51:15.072198 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-245tj" event={"ID":"916742b1-ba9e-4c96-9b2a-90a93854b8a6","Type":"ContainerStarted","Data":"4f2aaf961ba828344c674e5958d850825e69da672b0c4c521192a22fdad0d43f"} Feb 01 06:51:15 crc kubenswrapper[4546]: I0201 06:51:15.073595 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" event={"ID":"3950d2e9-eda7-4334-a1e1-1513629c662c","Type":"ContainerStarted","Data":"854d30c60db26544f74eae0298c89fbcffea97f743b3dd739241e8f595ebc583"} Feb 01 06:51:15 crc kubenswrapper[4546]: I0201 06:51:15.075003 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" event={"ID":"6afb50b9-8605-41e4-a7b2-9511c908663e","Type":"ContainerStarted","Data":"8a990735cb2777cab828442bdcc95396683207722a805b2b9b4832c0430280d3"} Feb 01 06:51:15 crc kubenswrapper[4546]: I0201 06:51:15.076283 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wst5k" event={"ID":"eeeb655a-c1d4-4d94-95cb-252208486e23","Type":"ContainerStarted","Data":"fd712a500a521c275371c093a4d9d6b2b004676ba0fe7c3ac4fa57fca6023fb5"} Feb 01 06:51:15 crc kubenswrapper[4546]: I0201 06:51:15.077810 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f68d6bfd5-njtfx" 
event={"ID":"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6","Type":"ContainerStarted","Data":"3502c17bfa8f547e95a63c95d378e5ddbfcc27be937362d41b4dfe93c5e56436"} Feb 01 06:51:15 crc kubenswrapper[4546]: I0201 06:51:15.077842 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f68d6bfd5-njtfx" event={"ID":"1c14859d-7a46-4a3c-9ae2-bcdca6a804b6","Type":"ContainerStarted","Data":"ef1fcc22d493ab9d000f047537352014b8cf47c836f593ef22921db94fe5d977"} Feb 01 06:51:15 crc kubenswrapper[4546]: I0201 06:51:15.095389 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f68d6bfd5-njtfx" podStartSLOduration=1.095373689 podStartE2EDuration="1.095373689s" podCreationTimestamp="2026-02-01 06:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:51:15.091738728 +0000 UTC m=+505.742674743" watchObservedRunningTime="2026-02-01 06:51:15.095373689 +0000 UTC m=+505.746309695" Feb 01 06:51:15 crc kubenswrapper[4546]: I0201 06:51:15.696317 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p4xgh" Feb 01 06:51:18 crc kubenswrapper[4546]: I0201 06:51:18.104804 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" event={"ID":"6afb50b9-8605-41e4-a7b2-9511c908663e","Type":"ContainerStarted","Data":"b25c1ee19a1a5267222e8710215c3f87dcd5cea6fdaa964f593a9fc28ce90097"} Feb 01 06:51:18 crc kubenswrapper[4546]: I0201 06:51:18.105581 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:18 crc kubenswrapper[4546]: I0201 06:51:18.107272 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wst5k" 
event={"ID":"eeeb655a-c1d4-4d94-95cb-252208486e23","Type":"ContainerStarted","Data":"4dabda63bdb0a04a73fca4a0988de178c71e81cd4fe7ea24fcec7b69b58cbd04"} Feb 01 06:51:18 crc kubenswrapper[4546]: I0201 06:51:18.107657 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:18 crc kubenswrapper[4546]: I0201 06:51:18.129712 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" podStartSLOduration=2.7201702279999997 podStartE2EDuration="5.129688844s" podCreationTimestamp="2026-02-01 06:51:13 +0000 UTC" firstStartedPulling="2026-02-01 06:51:14.724961745 +0000 UTC m=+505.375897761" lastFinishedPulling="2026-02-01 06:51:17.13448036 +0000 UTC m=+507.785416377" observedRunningTime="2026-02-01 06:51:18.120093536 +0000 UTC m=+508.771029572" watchObservedRunningTime="2026-02-01 06:51:18.129688844 +0000 UTC m=+508.780624860" Feb 01 06:51:18 crc kubenswrapper[4546]: I0201 06:51:18.144983 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-wst5k" podStartSLOduration=1.718930235 podStartE2EDuration="5.144970402s" podCreationTimestamp="2026-02-01 06:51:13 +0000 UTC" firstStartedPulling="2026-02-01 06:51:14.284887752 +0000 UTC m=+504.935823768" lastFinishedPulling="2026-02-01 06:51:17.710927918 +0000 UTC m=+508.361863935" observedRunningTime="2026-02-01 06:51:18.142527376 +0000 UTC m=+508.793463392" watchObservedRunningTime="2026-02-01 06:51:18.144970402 +0000 UTC m=+508.795906418" Feb 01 06:51:19 crc kubenswrapper[4546]: I0201 06:51:19.120232 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" event={"ID":"3950d2e9-eda7-4334-a1e1-1513629c662c","Type":"ContainerStarted","Data":"6fc888dfb0627357afc9f82f02f3b61c127c35205f58be0f63c3ba721ec5a902"} Feb 01 06:51:19 crc kubenswrapper[4546]: I0201 06:51:19.141683 4546 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-kr6pp" podStartSLOduration=2.035839343 podStartE2EDuration="6.141662993s" podCreationTimestamp="2026-02-01 06:51:13 +0000 UTC" firstStartedPulling="2026-02-01 06:51:14.53593803 +0000 UTC m=+505.186874047" lastFinishedPulling="2026-02-01 06:51:18.641761681 +0000 UTC m=+509.292697697" observedRunningTime="2026-02-01 06:51:19.135053444 +0000 UTC m=+509.785989459" watchObservedRunningTime="2026-02-01 06:51:19.141662993 +0000 UTC m=+509.792599009" Feb 01 06:51:20 crc kubenswrapper[4546]: I0201 06:51:20.130340 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-245tj" event={"ID":"916742b1-ba9e-4c96-9b2a-90a93854b8a6","Type":"ContainerStarted","Data":"cc58532bc48d70f1b00a9d383d1afd838b1fa621f3c0dfda6c46c0122a2cd7ca"} Feb 01 06:51:21 crc kubenswrapper[4546]: I0201 06:51:21.138887 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-245tj" event={"ID":"916742b1-ba9e-4c96-9b2a-90a93854b8a6","Type":"ContainerStarted","Data":"cfaae591e6928499e15458b74ad409dbd55911220751f1a09bf6596129065e9b"} Feb 01 06:51:21 crc kubenswrapper[4546]: I0201 06:51:21.160521 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-245tj" podStartSLOduration=1.7642838410000001 podStartE2EDuration="8.160499685s" podCreationTimestamp="2026-02-01 06:51:13 +0000 UTC" firstStartedPulling="2026-02-01 06:51:14.446741428 +0000 UTC m=+505.097677445" lastFinishedPulling="2026-02-01 06:51:20.842957283 +0000 UTC m=+511.493893289" observedRunningTime="2026-02-01 06:51:21.15792481 +0000 UTC m=+511.808860816" watchObservedRunningTime="2026-02-01 06:51:21.160499685 +0000 UTC m=+511.811435690" Feb 01 06:51:24 crc kubenswrapper[4546]: I0201 06:51:24.258487 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-wst5k" Feb 01 06:51:24 crc kubenswrapper[4546]: I0201 06:51:24.552549 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:24 crc kubenswrapper[4546]: I0201 06:51:24.552623 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:24 crc kubenswrapper[4546]: I0201 06:51:24.556727 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:25 crc kubenswrapper[4546]: I0201 06:51:25.166265 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f68d6bfd5-njtfx" Feb 01 06:51:25 crc kubenswrapper[4546]: I0201 06:51:25.204584 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8659n"] Feb 01 06:51:25 crc kubenswrapper[4546]: I0201 06:51:25.420341 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:51:25 crc kubenswrapper[4546]: I0201 06:51:25.420987 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:51:34 crc kubenswrapper[4546]: I0201 06:51:34.211966 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-ftxkp" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.360260 4546 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs"] Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.361969 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.364043 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.367217 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs"] Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.517527 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8sl6\" (UniqueName: \"kubernetes.io/projected/7dabecce-f2ee-4689-91d3-090ff64e5a2c-kube-api-access-q8sl6\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.517585 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.517693 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-bundle\") pod 
\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.618476 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8sl6\" (UniqueName: \"kubernetes.io/projected/7dabecce-f2ee-4689-91d3-090ff64e5a2c-kube-api-access-q8sl6\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.618798 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.619233 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.619310 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.619614 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.639039 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8sl6\" (UniqueName: \"kubernetes.io/projected/7dabecce-f2ee-4689-91d3-090ff64e5a2c-kube-api-access-q8sl6\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:44 crc kubenswrapper[4546]: I0201 06:51:44.679351 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:45 crc kubenswrapper[4546]: I0201 06:51:45.032954 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs"] Feb 01 06:51:45 crc kubenswrapper[4546]: I0201 06:51:45.291558 4546 generic.go:334] "Generic (PLEG): container finished" podID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerID="1f21b81565130992daed255d81281bee103ebb89ba6315b2a25f840310aeded1" exitCode=0 Feb 01 06:51:45 crc kubenswrapper[4546]: I0201 06:51:45.291618 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" event={"ID":"7dabecce-f2ee-4689-91d3-090ff64e5a2c","Type":"ContainerDied","Data":"1f21b81565130992daed255d81281bee103ebb89ba6315b2a25f840310aeded1"} Feb 01 06:51:45 crc kubenswrapper[4546]: I0201 06:51:45.291930 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" event={"ID":"7dabecce-f2ee-4689-91d3-090ff64e5a2c","Type":"ContainerStarted","Data":"646ac2f219323d00f2f1c4de40aa52c0bdf3961e5883fc345b4785dd1e08a188"} Feb 01 06:51:47 crc kubenswrapper[4546]: I0201 06:51:47.305936 4546 generic.go:334] "Generic (PLEG): container finished" podID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerID="337e259112534d0a445854c3c4c9ef33a12e64e38051e4e4a39138a2954c2039" exitCode=0 Feb 01 06:51:47 crc kubenswrapper[4546]: I0201 06:51:47.306022 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" event={"ID":"7dabecce-f2ee-4689-91d3-090ff64e5a2c","Type":"ContainerDied","Data":"337e259112534d0a445854c3c4c9ef33a12e64e38051e4e4a39138a2954c2039"} Feb 01 06:51:48 crc kubenswrapper[4546]: I0201 06:51:48.315849 4546 
generic.go:334] "Generic (PLEG): container finished" podID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerID="a48ec272cfc5d2f8fb737316a76ab57abb8a75289bf246ad82b5ed08c5dfabc9" exitCode=0 Feb 01 06:51:48 crc kubenswrapper[4546]: I0201 06:51:48.315899 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" event={"ID":"7dabecce-f2ee-4689-91d3-090ff64e5a2c","Type":"ContainerDied","Data":"a48ec272cfc5d2f8fb737316a76ab57abb8a75289bf246ad82b5ed08c5dfabc9"} Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.539371 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.683253 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-bundle\") pod \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.683357 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-util\") pod \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.683384 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8sl6\" (UniqueName: \"kubernetes.io/projected/7dabecce-f2ee-4689-91d3-090ff64e5a2c-kube-api-access-q8sl6\") pod \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\" (UID: \"7dabecce-f2ee-4689-91d3-090ff64e5a2c\") " Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.684099 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-bundle" (OuterVolumeSpecName: "bundle") pod "7dabecce-f2ee-4689-91d3-090ff64e5a2c" (UID: "7dabecce-f2ee-4689-91d3-090ff64e5a2c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.688733 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dabecce-f2ee-4689-91d3-090ff64e5a2c-kube-api-access-q8sl6" (OuterVolumeSpecName: "kube-api-access-q8sl6") pod "7dabecce-f2ee-4689-91d3-090ff64e5a2c" (UID: "7dabecce-f2ee-4689-91d3-090ff64e5a2c"). InnerVolumeSpecName "kube-api-access-q8sl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.694940 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-util" (OuterVolumeSpecName: "util") pod "7dabecce-f2ee-4689-91d3-090ff64e5a2c" (UID: "7dabecce-f2ee-4689-91d3-090ff64e5a2c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.784942 4546 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.784971 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8sl6\" (UniqueName: \"kubernetes.io/projected/7dabecce-f2ee-4689-91d3-090ff64e5a2c-kube-api-access-q8sl6\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:49 crc kubenswrapper[4546]: I0201 06:51:49.784986 4546 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7dabecce-f2ee-4689-91d3-090ff64e5a2c-util\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.235451 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8659n" podUID="81d1f1d9-4f02-4d8e-946c-9cc1592090ae" containerName="console" containerID="cri-o://d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74" gracePeriod=15 Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.328488 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" event={"ID":"7dabecce-f2ee-4689-91d3-090ff64e5a2c","Type":"ContainerDied","Data":"646ac2f219323d00f2f1c4de40aa52c0bdf3961e5883fc345b4785dd1e08a188"} Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.328535 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="646ac2f219323d00f2f1c4de40aa52c0bdf3961e5883fc345b4785dd1e08a188" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.328555 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcskkvs" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.537238 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8659n_81d1f1d9-4f02-4d8e-946c-9cc1592090ae/console/0.log" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.537355 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.694264 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-trusted-ca-bundle\") pod \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.694330 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-oauth-config\") pod \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.694368 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-serving-cert\") pod \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.694390 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-config\") pod \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " Feb 01 06:51:50 crc 
kubenswrapper[4546]: I0201 06:51:50.694426 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-service-ca\") pod \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.694452 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-oauth-serving-cert\") pod \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.694477 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzvz8\" (UniqueName: \"kubernetes.io/projected/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-kube-api-access-nzvz8\") pod \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\" (UID: \"81d1f1d9-4f02-4d8e-946c-9cc1592090ae\") " Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.695209 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "81d1f1d9-4f02-4d8e-946c-9cc1592090ae" (UID: "81d1f1d9-4f02-4d8e-946c-9cc1592090ae"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.695263 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-config" (OuterVolumeSpecName: "console-config") pod "81d1f1d9-4f02-4d8e-946c-9cc1592090ae" (UID: "81d1f1d9-4f02-4d8e-946c-9cc1592090ae"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.695280 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-service-ca" (OuterVolumeSpecName: "service-ca") pod "81d1f1d9-4f02-4d8e-946c-9cc1592090ae" (UID: "81d1f1d9-4f02-4d8e-946c-9cc1592090ae"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.695531 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "81d1f1d9-4f02-4d8e-946c-9cc1592090ae" (UID: "81d1f1d9-4f02-4d8e-946c-9cc1592090ae"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.699236 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "81d1f1d9-4f02-4d8e-946c-9cc1592090ae" (UID: "81d1f1d9-4f02-4d8e-946c-9cc1592090ae"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.699675 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "81d1f1d9-4f02-4d8e-946c-9cc1592090ae" (UID: "81d1f1d9-4f02-4d8e-946c-9cc1592090ae"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.700514 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-kube-api-access-nzvz8" (OuterVolumeSpecName: "kube-api-access-nzvz8") pod "81d1f1d9-4f02-4d8e-946c-9cc1592090ae" (UID: "81d1f1d9-4f02-4d8e-946c-9cc1592090ae"). InnerVolumeSpecName "kube-api-access-nzvz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.795749 4546 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.795791 4546 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.795804 4546 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-console-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.795816 4546 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.795828 4546 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.795839 4546 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-nzvz8\" (UniqueName: \"kubernetes.io/projected/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-kube-api-access-nzvz8\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:50 crc kubenswrapper[4546]: I0201 06:51:50.795870 4546 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81d1f1d9-4f02-4d8e-946c-9cc1592090ae-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.336101 4546 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8659n_81d1f1d9-4f02-4d8e-946c-9cc1592090ae/console/0.log" Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.336435 4546 generic.go:334] "Generic (PLEG): container finished" podID="81d1f1d9-4f02-4d8e-946c-9cc1592090ae" containerID="d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74" exitCode=2 Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.336498 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8659n" event={"ID":"81d1f1d9-4f02-4d8e-946c-9cc1592090ae","Type":"ContainerDied","Data":"d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74"} Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.336539 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8659n" Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.336566 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8659n" event={"ID":"81d1f1d9-4f02-4d8e-946c-9cc1592090ae","Type":"ContainerDied","Data":"95420f6555dc29b56292435133e15c3332a4bb93d7a736c9e0311ddc7c29afdd"} Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.336626 4546 scope.go:117] "RemoveContainer" containerID="d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74" Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.351700 4546 scope.go:117] "RemoveContainer" containerID="d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74" Feb 01 06:51:51 crc kubenswrapper[4546]: E0201 06:51:51.352340 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74\": container with ID starting with d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74 not found: ID does not exist" containerID="d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74" Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.352415 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74"} err="failed to get container status \"d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74\": rpc error: code = NotFound desc = could not find container \"d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74\": container with ID starting with d917f9f4c3e9f3e758366a74a6197a20737d2fa68cea3bd6dd95df778fc4ac74 not found: ID does not exist" Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.365499 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8659n"] Feb 01 06:51:51 crc 
kubenswrapper[4546]: I0201 06:51:51.367251 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8659n"] Feb 01 06:51:51 crc kubenswrapper[4546]: I0201 06:51:51.662600 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d1f1d9-4f02-4d8e-946c-9cc1592090ae" path="/var/lib/kubelet/pods/81d1f1d9-4f02-4d8e-946c-9cc1592090ae/volumes" Feb 01 06:51:55 crc kubenswrapper[4546]: I0201 06:51:55.421043 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:51:55 crc kubenswrapper[4546]: I0201 06:51:55.421399 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.741677 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8"] Feb 01 06:51:59 crc kubenswrapper[4546]: E0201 06:51:59.742109 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerName="util" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.742122 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerName="util" Feb 01 06:51:59 crc kubenswrapper[4546]: E0201 06:51:59.742133 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerName="pull" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.742139 4546 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerName="pull" Feb 01 06:51:59 crc kubenswrapper[4546]: E0201 06:51:59.742149 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerName="extract" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.742154 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerName="extract" Feb 01 06:51:59 crc kubenswrapper[4546]: E0201 06:51:59.742163 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d1f1d9-4f02-4d8e-946c-9cc1592090ae" containerName="console" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.742167 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d1f1d9-4f02-4d8e-946c-9cc1592090ae" containerName="console" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.742250 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dabecce-f2ee-4689-91d3-090ff64e5a2c" containerName="extract" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.742257 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d1f1d9-4f02-4d8e-946c-9cc1592090ae" containerName="console" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.742571 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.744972 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.745347 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9lx27" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.745655 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.745868 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.745974 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.765132 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8"] Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.817464 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wftq7\" (UniqueName: \"kubernetes.io/projected/1c46877b-6f03-479e-a336-5e5ae48441bf-kube-api-access-wftq7\") pod \"metallb-operator-controller-manager-8788c77f7-gwnj8\" (UID: \"1c46877b-6f03-479e-a336-5e5ae48441bf\") " pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.817524 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c46877b-6f03-479e-a336-5e5ae48441bf-webhook-cert\") pod 
\"metallb-operator-controller-manager-8788c77f7-gwnj8\" (UID: \"1c46877b-6f03-479e-a336-5e5ae48441bf\") " pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.817649 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c46877b-6f03-479e-a336-5e5ae48441bf-apiservice-cert\") pod \"metallb-operator-controller-manager-8788c77f7-gwnj8\" (UID: \"1c46877b-6f03-479e-a336-5e5ae48441bf\") " pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.896657 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2"] Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.897575 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.898842 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qfm5n" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.899627 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.899998 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.911283 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2"] Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.918507 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-968cx\" (UniqueName: 
\"kubernetes.io/projected/8ccad9e7-12f1-4db3-9240-ead0c5197c08-kube-api-access-968cx\") pod \"metallb-operator-webhook-server-7f87946b68-8khq2\" (UID: \"8ccad9e7-12f1-4db3-9240-ead0c5197c08\") " pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.918564 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c46877b-6f03-479e-a336-5e5ae48441bf-apiservice-cert\") pod \"metallb-operator-controller-manager-8788c77f7-gwnj8\" (UID: \"1c46877b-6f03-479e-a336-5e5ae48441bf\") " pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.918593 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wftq7\" (UniqueName: \"kubernetes.io/projected/1c46877b-6f03-479e-a336-5e5ae48441bf-kube-api-access-wftq7\") pod \"metallb-operator-controller-manager-8788c77f7-gwnj8\" (UID: \"1c46877b-6f03-479e-a336-5e5ae48441bf\") " pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.918614 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccad9e7-12f1-4db3-9240-ead0c5197c08-webhook-cert\") pod \"metallb-operator-webhook-server-7f87946b68-8khq2\" (UID: \"8ccad9e7-12f1-4db3-9240-ead0c5197c08\") " pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.918630 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccad9e7-12f1-4db3-9240-ead0c5197c08-apiservice-cert\") pod \"metallb-operator-webhook-server-7f87946b68-8khq2\" (UID: \"8ccad9e7-12f1-4db3-9240-ead0c5197c08\") " 
pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.918667 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c46877b-6f03-479e-a336-5e5ae48441bf-webhook-cert\") pod \"metallb-operator-controller-manager-8788c77f7-gwnj8\" (UID: \"1c46877b-6f03-479e-a336-5e5ae48441bf\") " pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.924075 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c46877b-6f03-479e-a336-5e5ae48441bf-webhook-cert\") pod \"metallb-operator-controller-manager-8788c77f7-gwnj8\" (UID: \"1c46877b-6f03-479e-a336-5e5ae48441bf\") " pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.928369 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c46877b-6f03-479e-a336-5e5ae48441bf-apiservice-cert\") pod \"metallb-operator-controller-manager-8788c77f7-gwnj8\" (UID: \"1c46877b-6f03-479e-a336-5e5ae48441bf\") " pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:51:59 crc kubenswrapper[4546]: I0201 06:51:59.944292 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wftq7\" (UniqueName: \"kubernetes.io/projected/1c46877b-6f03-479e-a336-5e5ae48441bf-kube-api-access-wftq7\") pod \"metallb-operator-controller-manager-8788c77f7-gwnj8\" (UID: \"1c46877b-6f03-479e-a336-5e5ae48441bf\") " pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:52:00 crc kubenswrapper[4546]: I0201 06:52:00.019776 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/8ccad9e7-12f1-4db3-9240-ead0c5197c08-webhook-cert\") pod \"metallb-operator-webhook-server-7f87946b68-8khq2\" (UID: \"8ccad9e7-12f1-4db3-9240-ead0c5197c08\") " pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:52:00 crc kubenswrapper[4546]: I0201 06:52:00.020787 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccad9e7-12f1-4db3-9240-ead0c5197c08-apiservice-cert\") pod \"metallb-operator-webhook-server-7f87946b68-8khq2\" (UID: \"8ccad9e7-12f1-4db3-9240-ead0c5197c08\") " pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:52:00 crc kubenswrapper[4546]: I0201 06:52:00.021039 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-968cx\" (UniqueName: \"kubernetes.io/projected/8ccad9e7-12f1-4db3-9240-ead0c5197c08-kube-api-access-968cx\") pod \"metallb-operator-webhook-server-7f87946b68-8khq2\" (UID: \"8ccad9e7-12f1-4db3-9240-ead0c5197c08\") " pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:52:00 crc kubenswrapper[4546]: I0201 06:52:00.023427 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccad9e7-12f1-4db3-9240-ead0c5197c08-webhook-cert\") pod \"metallb-operator-webhook-server-7f87946b68-8khq2\" (UID: \"8ccad9e7-12f1-4db3-9240-ead0c5197c08\") " pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:52:00 crc kubenswrapper[4546]: I0201 06:52:00.023509 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccad9e7-12f1-4db3-9240-ead0c5197c08-apiservice-cert\") pod \"metallb-operator-webhook-server-7f87946b68-8khq2\" (UID: \"8ccad9e7-12f1-4db3-9240-ead0c5197c08\") " pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:52:00 crc 
kubenswrapper[4546]: I0201 06:52:00.037815 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-968cx\" (UniqueName: \"kubernetes.io/projected/8ccad9e7-12f1-4db3-9240-ead0c5197c08-kube-api-access-968cx\") pod \"metallb-operator-webhook-server-7f87946b68-8khq2\" (UID: \"8ccad9e7-12f1-4db3-9240-ead0c5197c08\") " pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:52:00 crc kubenswrapper[4546]: I0201 06:52:00.058314 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:52:00 crc kubenswrapper[4546]: I0201 06:52:00.210391 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:52:00 crc kubenswrapper[4546]: I0201 06:52:00.405508 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2"] Feb 01 06:52:00 crc kubenswrapper[4546]: W0201 06:52:00.414956 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ccad9e7_12f1_4db3_9240_ead0c5197c08.slice/crio-245c381c0240c5c89081caffa864380f0ab9ee5df00bf7d3d987609d6eb397ef WatchSource:0}: Error finding container 245c381c0240c5c89081caffa864380f0ab9ee5df00bf7d3d987609d6eb397ef: Status 404 returned error can't find the container with id 245c381c0240c5c89081caffa864380f0ab9ee5df00bf7d3d987609d6eb397ef Feb 01 06:52:00 crc kubenswrapper[4546]: I0201 06:52:00.446703 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8"] Feb 01 06:52:00 crc kubenswrapper[4546]: W0201 06:52:00.459845 4546 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c46877b_6f03_479e_a336_5e5ae48441bf.slice/crio-6694ff8c66893b058c6adfeec238b42bc9977679827802146639e8337ae254e3 WatchSource:0}: Error finding container 6694ff8c66893b058c6adfeec238b42bc9977679827802146639e8337ae254e3: Status 404 returned error can't find the container with id 6694ff8c66893b058c6adfeec238b42bc9977679827802146639e8337ae254e3 Feb 01 06:52:01 crc kubenswrapper[4546]: I0201 06:52:01.396797 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" event={"ID":"1c46877b-6f03-479e-a336-5e5ae48441bf","Type":"ContainerStarted","Data":"6694ff8c66893b058c6adfeec238b42bc9977679827802146639e8337ae254e3"} Feb 01 06:52:01 crc kubenswrapper[4546]: I0201 06:52:01.398524 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" event={"ID":"8ccad9e7-12f1-4db3-9240-ead0c5197c08","Type":"ContainerStarted","Data":"245c381c0240c5c89081caffa864380f0ab9ee5df00bf7d3d987609d6eb397ef"} Feb 01 06:52:06 crc kubenswrapper[4546]: I0201 06:52:06.433918 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" event={"ID":"8ccad9e7-12f1-4db3-9240-ead0c5197c08","Type":"ContainerStarted","Data":"0900f74cf2f3eea3ac3e324957374bcb04140d24adec09d2d714e8108550d591"} Feb 01 06:52:06 crc kubenswrapper[4546]: I0201 06:52:06.434892 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:52:06 crc kubenswrapper[4546]: I0201 06:52:06.436079 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" event={"ID":"1c46877b-6f03-479e-a336-5e5ae48441bf","Type":"ContainerStarted","Data":"354e4d1874ff8491ce7163f8a9e443de8ed8c4298a19cd8945fda2b3568daeeb"} Feb 01 06:52:06 crc 
kubenswrapper[4546]: I0201 06:52:06.436271 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:52:06 crc kubenswrapper[4546]: I0201 06:52:06.456432 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" podStartSLOduration=1.945143924 podStartE2EDuration="7.456422648s" podCreationTimestamp="2026-02-01 06:51:59 +0000 UTC" firstStartedPulling="2026-02-01 06:52:00.417510223 +0000 UTC m=+551.068446239" lastFinishedPulling="2026-02-01 06:52:05.928788948 +0000 UTC m=+556.579724963" observedRunningTime="2026-02-01 06:52:06.451836393 +0000 UTC m=+557.102772399" watchObservedRunningTime="2026-02-01 06:52:06.456422648 +0000 UTC m=+557.107358664" Feb 01 06:52:06 crc kubenswrapper[4546]: I0201 06:52:06.480159 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" podStartSLOduration=2.019091841 podStartE2EDuration="7.480124438s" podCreationTimestamp="2026-02-01 06:51:59 +0000 UTC" firstStartedPulling="2026-02-01 06:52:00.462921757 +0000 UTC m=+551.113857773" lastFinishedPulling="2026-02-01 06:52:05.923954354 +0000 UTC m=+556.574890370" observedRunningTime="2026-02-01 06:52:06.475640606 +0000 UTC m=+557.126576621" watchObservedRunningTime="2026-02-01 06:52:06.480124438 +0000 UTC m=+557.131060454" Feb 01 06:52:20 crc kubenswrapper[4546]: I0201 06:52:20.215476 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7f87946b68-8khq2" Feb 01 06:52:25 crc kubenswrapper[4546]: I0201 06:52:25.421156 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 01 06:52:25 crc kubenswrapper[4546]: I0201 06:52:25.421441 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:52:25 crc kubenswrapper[4546]: I0201 06:52:25.421482 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:52:25 crc kubenswrapper[4546]: I0201 06:52:25.422070 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75a51418488f257f1413aee0bcf03cd98552efa50d1a91c2d8fa14ab0a5d1e3c"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 06:52:25 crc kubenswrapper[4546]: I0201 06:52:25.422141 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://75a51418488f257f1413aee0bcf03cd98552efa50d1a91c2d8fa14ab0a5d1e3c" gracePeriod=600 Feb 01 06:52:26 crc kubenswrapper[4546]: I0201 06:52:26.548533 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="75a51418488f257f1413aee0bcf03cd98552efa50d1a91c2d8fa14ab0a5d1e3c" exitCode=0 Feb 01 06:52:26 crc kubenswrapper[4546]: I0201 06:52:26.548612 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"75a51418488f257f1413aee0bcf03cd98552efa50d1a91c2d8fa14ab0a5d1e3c"} Feb 01 06:52:26 crc kubenswrapper[4546]: I0201 06:52:26.549249 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"35c1ceef8d4590b6c0af653c1017461916a166a5c1d2dcb5faa5ca14e92cf91e"} Feb 01 06:52:26 crc kubenswrapper[4546]: I0201 06:52:26.549277 4546 scope.go:117] "RemoveContainer" containerID="fe514fb7e5a4706637156f35a07f75c3df77c458aae7b607aeb24537d931b4e3" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.062642 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-8788c77f7-gwnj8" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.602491 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wzqbx"] Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.605422 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.610591 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6pxks" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.611954 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.612212 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.618699 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6"] Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.619601 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.620933 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.636306 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6"] Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.658346 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fv8l\" (UniqueName: \"kubernetes.io/projected/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-kube-api-access-2fv8l\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.658750 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-frr-conf\") pod \"frr-k8s-wzqbx\" (UID: 
\"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.658773 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mng8p\" (UniqueName: \"kubernetes.io/projected/1b76a263-eee7-4bf8-90df-e61731efa91f-kube-api-access-mng8p\") pod \"frr-k8s-webhook-server-7df86c4f6c-gvwt6\" (UID: \"1b76a263-eee7-4bf8-90df-e61731efa91f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.658825 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-frr-startup\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.658899 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-metrics-certs\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.658936 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-reloader\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.658972 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-frr-sockets\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " 
pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.659010 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-metrics\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.659046 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b76a263-eee7-4bf8-90df-e61731efa91f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-gvwt6\" (UID: \"1b76a263-eee7-4bf8-90df-e61731efa91f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.705999 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-x85bn"] Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.707488 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.710839 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lrszf" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.711360 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.711545 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.717363 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.738225 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-wvbs6"] Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.739763 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.742812 4546 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.755512 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-wvbs6"] Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760281 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-memberlist\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760346 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fv8l\" (UniqueName: \"kubernetes.io/projected/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-kube-api-access-2fv8l\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760374 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1720f98b-a156-436f-9922-8da0c3f5d3d7-metallb-excludel2\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760397 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-frr-conf\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760427 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mng8p\" (UniqueName: \"kubernetes.io/projected/1b76a263-eee7-4bf8-90df-e61731efa91f-kube-api-access-mng8p\") pod \"frr-k8s-webhook-server-7df86c4f6c-gvwt6\" (UID: \"1b76a263-eee7-4bf8-90df-e61731efa91f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760464 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-frr-startup\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760487 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-metrics-certs\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760517 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-reloader\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760539 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b39966e3-2127-4be1-848d-ba96262c7e74-metrics-certs\") pod \"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760562 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-frr-sockets\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760581 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b39966e3-2127-4be1-848d-ba96262c7e74-cert\") pod \"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760601 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5x68\" (UniqueName: \"kubernetes.io/projected/1720f98b-a156-436f-9922-8da0c3f5d3d7-kube-api-access-p5x68\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760625 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-metrics\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760645 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-metrics-certs\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760666 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b76a263-eee7-4bf8-90df-e61731efa91f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-gvwt6\" (UID: 
\"1b76a263-eee7-4bf8-90df-e61731efa91f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.760691 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sssrb\" (UniqueName: \"kubernetes.io/projected/b39966e3-2127-4be1-848d-ba96262c7e74-kube-api-access-sssrb\") pod \"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.761026 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-reloader\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.761122 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-frr-sockets\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.761319 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-frr-conf\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.761968 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-metrics\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: E0201 06:52:40.762102 4546 secret.go:188] 
Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 01 06:52:40 crc kubenswrapper[4546]: E0201 06:52:40.762180 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-metrics-certs podName:5bf15ac8-d06d-40f9-83b2-c811d3b6d47b nodeName:}" failed. No retries permitted until 2026-02-01 06:52:41.262157807 +0000 UTC m=+591.913093823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-metrics-certs") pod "frr-k8s-wzqbx" (UID: "5bf15ac8-d06d-40f9-83b2-c811d3b6d47b") : secret "frr-k8s-certs-secret" not found Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.762306 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-frr-startup\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.772849 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b76a263-eee7-4bf8-90df-e61731efa91f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-gvwt6\" (UID: \"1b76a263-eee7-4bf8-90df-e61731efa91f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.775173 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fv8l\" (UniqueName: \"kubernetes.io/projected/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-kube-api-access-2fv8l\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.777916 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mng8p\" 
(UniqueName: \"kubernetes.io/projected/1b76a263-eee7-4bf8-90df-e61731efa91f-kube-api-access-mng8p\") pod \"frr-k8s-webhook-server-7df86c4f6c-gvwt6\" (UID: \"1b76a263-eee7-4bf8-90df-e61731efa91f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.862433 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b39966e3-2127-4be1-848d-ba96262c7e74-metrics-certs\") pod \"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:40 crc kubenswrapper[4546]: E0201 06:52:40.862553 4546 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 01 06:52:40 crc kubenswrapper[4546]: E0201 06:52:40.862613 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b39966e3-2127-4be1-848d-ba96262c7e74-metrics-certs podName:b39966e3-2127-4be1-848d-ba96262c7e74 nodeName:}" failed. No retries permitted until 2026-02-01 06:52:41.362597982 +0000 UTC m=+592.013533998 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b39966e3-2127-4be1-848d-ba96262c7e74-metrics-certs") pod "controller-6968d8fdc4-wvbs6" (UID: "b39966e3-2127-4be1-848d-ba96262c7e74") : secret "controller-certs-secret" not found Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.862840 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b39966e3-2127-4be1-848d-ba96262c7e74-cert\") pod \"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.863259 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5x68\" (UniqueName: \"kubernetes.io/projected/1720f98b-a156-436f-9922-8da0c3f5d3d7-kube-api-access-p5x68\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.863299 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-metrics-certs\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.863336 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sssrb\" (UniqueName: \"kubernetes.io/projected/b39966e3-2127-4be1-848d-ba96262c7e74-kube-api-access-sssrb\") pod \"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.863373 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-memberlist\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.863436 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1720f98b-a156-436f-9922-8da0c3f5d3d7-metallb-excludel2\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: E0201 06:52:40.863726 4546 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 01 06:52:40 crc kubenswrapper[4546]: E0201 06:52:40.863788 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-memberlist podName:1720f98b-a156-436f-9922-8da0c3f5d3d7 nodeName:}" failed. No retries permitted until 2026-02-01 06:52:41.363770141 +0000 UTC m=+592.014706157 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-memberlist") pod "speaker-x85bn" (UID: "1720f98b-a156-436f-9922-8da0c3f5d3d7") : secret "metallb-memberlist" not found Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.864093 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1720f98b-a156-436f-9922-8da0c3f5d3d7-metallb-excludel2\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.866363 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b39966e3-2127-4be1-848d-ba96262c7e74-cert\") pod \"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.875329 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-metrics-certs\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.876168 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sssrb\" (UniqueName: \"kubernetes.io/projected/b39966e3-2127-4be1-848d-ba96262c7e74-kube-api-access-sssrb\") pod \"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.892031 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5x68\" (UniqueName: \"kubernetes.io/projected/1720f98b-a156-436f-9922-8da0c3f5d3d7-kube-api-access-p5x68\") pod 
\"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:40 crc kubenswrapper[4546]: I0201 06:52:40.933274 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.118547 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6"] Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.267804 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-metrics-certs\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.271638 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bf15ac8-d06d-40f9-83b2-c811d3b6d47b-metrics-certs\") pod \"frr-k8s-wzqbx\" (UID: \"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b\") " pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.369149 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-memberlist\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:41 crc kubenswrapper[4546]: E0201 06:52:41.369285 4546 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.369297 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b39966e3-2127-4be1-848d-ba96262c7e74-metrics-certs\") pod 
\"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:41 crc kubenswrapper[4546]: E0201 06:52:41.369352 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-memberlist podName:1720f98b-a156-436f-9922-8da0c3f5d3d7 nodeName:}" failed. No retries permitted until 2026-02-01 06:52:42.369334592 +0000 UTC m=+593.020270609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-memberlist") pod "speaker-x85bn" (UID: "1720f98b-a156-436f-9922-8da0c3f5d3d7") : secret "metallb-memberlist" not found Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.372362 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b39966e3-2127-4be1-848d-ba96262c7e74-metrics-certs\") pod \"controller-6968d8fdc4-wvbs6\" (UID: \"b39966e3-2127-4be1-848d-ba96262c7e74\") " pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.524969 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.632937 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerStarted","Data":"f15c6c819ebd288b0ec4ca82988c2be5c054176acefbee9ece904c1494faa3c2"} Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.634027 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" event={"ID":"1b76a263-eee7-4bf8-90df-e61731efa91f","Type":"ContainerStarted","Data":"6f09b337ecd8cf851833e6147d91687f81c4ecb62103b9d880233d6796cb3c5d"} Feb 01 06:52:41 crc kubenswrapper[4546]: I0201 06:52:41.652041 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.042238 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-wvbs6"] Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.386594 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-memberlist\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.396458 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1720f98b-a156-436f-9922-8da0c3f5d3d7-memberlist\") pod \"speaker-x85bn\" (UID: \"1720f98b-a156-436f-9922-8da0c3f5d3d7\") " pod="metallb-system/speaker-x85bn" Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.520537 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-x85bn" Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.643125 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x85bn" event={"ID":"1720f98b-a156-436f-9922-8da0c3f5d3d7","Type":"ContainerStarted","Data":"0f9447b9bd8f3654481d4e2953f897ef2fda0c7f33fbacc14c055b629feb2ff1"} Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.647421 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-wvbs6" event={"ID":"b39966e3-2127-4be1-848d-ba96262c7e74","Type":"ContainerStarted","Data":"50c187124f51b740fb62d7608a205dbeb90b1ea7368e56092f7a287973dc34cd"} Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.647449 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-wvbs6" event={"ID":"b39966e3-2127-4be1-848d-ba96262c7e74","Type":"ContainerStarted","Data":"2a7237982534fc7b3d9827aa658d1c4a08fa5a239fbccecc8c2a035ca476e4cc"} Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.647461 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-wvbs6" event={"ID":"b39966e3-2127-4be1-848d-ba96262c7e74","Type":"ContainerStarted","Data":"e4a00579b5c2179841434f19ed4588dad4464fdfe05b3b0d9b7d4578d8ceae14"} Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.647826 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:52:42 crc kubenswrapper[4546]: I0201 06:52:42.676283 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-wvbs6" podStartSLOduration=2.676264635 podStartE2EDuration="2.676264635s" podCreationTimestamp="2026-02-01 06:52:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:52:42.67339215 +0000 UTC m=+593.324328167" 
watchObservedRunningTime="2026-02-01 06:52:42.676264635 +0000 UTC m=+593.327200650" Feb 01 06:52:43 crc kubenswrapper[4546]: I0201 06:52:43.670654 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x85bn" event={"ID":"1720f98b-a156-436f-9922-8da0c3f5d3d7","Type":"ContainerStarted","Data":"d69c8f9dccd11d3409f68aeb91d3cde784b986b305217fa5fc2b1572a4a82130"} Feb 01 06:52:43 crc kubenswrapper[4546]: I0201 06:52:43.671013 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-x85bn" event={"ID":"1720f98b-a156-436f-9922-8da0c3f5d3d7","Type":"ContainerStarted","Data":"61187672288f0098a2666b1ec602abf9523ea616ee35f7b68da3ecd946fb5635"} Feb 01 06:52:43 crc kubenswrapper[4546]: I0201 06:52:43.677441 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-x85bn" podStartSLOduration=3.677423665 podStartE2EDuration="3.677423665s" podCreationTimestamp="2026-02-01 06:52:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:52:43.672258778 +0000 UTC m=+594.323194794" watchObservedRunningTime="2026-02-01 06:52:43.677423665 +0000 UTC m=+594.328359671" Feb 01 06:52:44 crc kubenswrapper[4546]: I0201 06:52:44.660348 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-x85bn" Feb 01 06:52:48 crc kubenswrapper[4546]: I0201 06:52:48.686091 4546 generic.go:334] "Generic (PLEG): container finished" podID="5bf15ac8-d06d-40f9-83b2-c811d3b6d47b" containerID="c9a9c42ba182a0abf37430df333dd6dbb96275028c362c31b41d199f25551cb8" exitCode=0 Feb 01 06:52:48 crc kubenswrapper[4546]: I0201 06:52:48.686201 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerDied","Data":"c9a9c42ba182a0abf37430df333dd6dbb96275028c362c31b41d199f25551cb8"} Feb 01 06:52:48 crc 
kubenswrapper[4546]: I0201 06:52:48.690416 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" event={"ID":"1b76a263-eee7-4bf8-90df-e61731efa91f","Type":"ContainerStarted","Data":"5a441b70d41a913e8d03177d7b76e4cb82930a479cff3e9adf70acbe4868c404"} Feb 01 06:52:48 crc kubenswrapper[4546]: I0201 06:52:48.690748 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:52:48 crc kubenswrapper[4546]: I0201 06:52:48.732841 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" podStartSLOduration=1.420193353 podStartE2EDuration="8.732828108s" podCreationTimestamp="2026-02-01 06:52:40 +0000 UTC" firstStartedPulling="2026-02-01 06:52:41.140211106 +0000 UTC m=+591.791147111" lastFinishedPulling="2026-02-01 06:52:48.45284586 +0000 UTC m=+599.103781866" observedRunningTime="2026-02-01 06:52:48.72530606 +0000 UTC m=+599.376242076" watchObservedRunningTime="2026-02-01 06:52:48.732828108 +0000 UTC m=+599.383764115" Feb 01 06:52:49 crc kubenswrapper[4546]: I0201 06:52:49.699472 4546 generic.go:334] "Generic (PLEG): container finished" podID="5bf15ac8-d06d-40f9-83b2-c811d3b6d47b" containerID="b0bde8958cf61c76727e065e73b41ae4932c8b3a9d729a08f53587f5bd458e2f" exitCode=0 Feb 01 06:52:49 crc kubenswrapper[4546]: I0201 06:52:49.700650 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerDied","Data":"b0bde8958cf61c76727e065e73b41ae4932c8b3a9d729a08f53587f5bd458e2f"} Feb 01 06:52:50 crc kubenswrapper[4546]: I0201 06:52:50.707256 4546 generic.go:334] "Generic (PLEG): container finished" podID="5bf15ac8-d06d-40f9-83b2-c811d3b6d47b" containerID="941c98bc97e8d6e2b8ba7461fd0d6a275db78d686a61c84936755886182e8440" exitCode=0 Feb 01 06:52:50 crc kubenswrapper[4546]: 
I0201 06:52:50.707580 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerDied","Data":"941c98bc97e8d6e2b8ba7461fd0d6a275db78d686a61c84936755886182e8440"} Feb 01 06:52:51 crc kubenswrapper[4546]: I0201 06:52:51.719224 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerStarted","Data":"bad77e1c2a22a9f86d8d42f4dcf1df3aba3e2461747b78cff084a608f58e81a4"} Feb 01 06:52:51 crc kubenswrapper[4546]: I0201 06:52:51.719554 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerStarted","Data":"3059adb42230d13a8edcee322713f5f8bd39cd7a20653c87377c5905a67eff7a"} Feb 01 06:52:51 crc kubenswrapper[4546]: I0201 06:52:51.719575 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:51 crc kubenswrapper[4546]: I0201 06:52:51.719586 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerStarted","Data":"c0b8dc5c5c060d3b4951879d2aeef26c957c183cc51e086ba44737d9648f4e61"} Feb 01 06:52:51 crc kubenswrapper[4546]: I0201 06:52:51.719597 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerStarted","Data":"a72ff27e82534a8fb890312da6ae706e1e5af2cdcefabb355c4acc4e62910051"} Feb 01 06:52:51 crc kubenswrapper[4546]: I0201 06:52:51.719608 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerStarted","Data":"f2294da620eb451a43aa0ffd595e838096de15a471ab7250939bdfbec8624488"} Feb 01 06:52:51 crc kubenswrapper[4546]: I0201 
06:52:51.719616 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wzqbx" event={"ID":"5bf15ac8-d06d-40f9-83b2-c811d3b6d47b","Type":"ContainerStarted","Data":"24b17cdf5715dfab642d88be1bc49f445e12381bd9a5f7d541077eb73f329410"} Feb 01 06:52:51 crc kubenswrapper[4546]: I0201 06:52:51.752912 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-wzqbx" podStartSLOduration=4.921078388 podStartE2EDuration="11.752892315s" podCreationTimestamp="2026-02-01 06:52:40 +0000 UTC" firstStartedPulling="2026-02-01 06:52:41.615228935 +0000 UTC m=+592.266164950" lastFinishedPulling="2026-02-01 06:52:48.44704287 +0000 UTC m=+599.097978877" observedRunningTime="2026-02-01 06:52:51.747130875 +0000 UTC m=+602.398066891" watchObservedRunningTime="2026-02-01 06:52:51.752892315 +0000 UTC m=+602.403828331" Feb 01 06:52:52 crc kubenswrapper[4546]: I0201 06:52:52.526639 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-x85bn" Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.484839 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fqfhw"] Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.485483 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fqfhw" Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.487220 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hnpln" Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.487452 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.487792 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.496108 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fqfhw"] Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.578363 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drcz4\" (UniqueName: \"kubernetes.io/projected/98cee7d0-9a89-432c-9de6-3a29fe637cef-kube-api-access-drcz4\") pod \"openstack-operator-index-fqfhw\" (UID: \"98cee7d0-9a89-432c-9de6-3a29fe637cef\") " pod="openstack-operators/openstack-operator-index-fqfhw" Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.679256 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drcz4\" (UniqueName: \"kubernetes.io/projected/98cee7d0-9a89-432c-9de6-3a29fe637cef-kube-api-access-drcz4\") pod \"openstack-operator-index-fqfhw\" (UID: \"98cee7d0-9a89-432c-9de6-3a29fe637cef\") " pod="openstack-operators/openstack-operator-index-fqfhw" Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.696315 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drcz4\" (UniqueName: \"kubernetes.io/projected/98cee7d0-9a89-432c-9de6-3a29fe637cef-kube-api-access-drcz4\") pod \"openstack-operator-index-fqfhw\" (UID: 
\"98cee7d0-9a89-432c-9de6-3a29fe637cef\") " pod="openstack-operators/openstack-operator-index-fqfhw" Feb 01 06:52:54 crc kubenswrapper[4546]: I0201 06:52:54.805230 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fqfhw" Feb 01 06:52:55 crc kubenswrapper[4546]: I0201 06:52:55.204289 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fqfhw"] Feb 01 06:52:55 crc kubenswrapper[4546]: I0201 06:52:55.744915 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fqfhw" event={"ID":"98cee7d0-9a89-432c-9de6-3a29fe637cef","Type":"ContainerStarted","Data":"9498bf6e63bd92ac9680a26b7be9c107af6cae1c3e3ae42fc7ece16671955add"} Feb 01 06:52:56 crc kubenswrapper[4546]: I0201 06:52:56.525931 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:56 crc kubenswrapper[4546]: I0201 06:52:56.559655 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:52:56 crc kubenswrapper[4546]: I0201 06:52:56.755345 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fqfhw" event={"ID":"98cee7d0-9a89-432c-9de6-3a29fe637cef","Type":"ContainerStarted","Data":"b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4"} Feb 01 06:52:57 crc kubenswrapper[4546]: I0201 06:52:57.868635 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fqfhw" podStartSLOduration=2.521692397 podStartE2EDuration="3.868608345s" podCreationTimestamp="2026-02-01 06:52:54 +0000 UTC" firstStartedPulling="2026-02-01 06:52:55.208764159 +0000 UTC m=+605.859700175" lastFinishedPulling="2026-02-01 06:52:56.555680106 +0000 UTC m=+607.206616123" observedRunningTime="2026-02-01 06:52:56.789590798 
+0000 UTC m=+607.440526814" watchObservedRunningTime="2026-02-01 06:52:57.868608345 +0000 UTC m=+608.519544371" Feb 01 06:52:57 crc kubenswrapper[4546]: I0201 06:52:57.872322 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-fqfhw"] Feb 01 06:52:58 crc kubenswrapper[4546]: I0201 06:52:58.470077 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dbks6"] Feb 01 06:52:58 crc kubenswrapper[4546]: I0201 06:52:58.470911 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dbks6" Feb 01 06:52:58 crc kubenswrapper[4546]: I0201 06:52:58.486641 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dbks6"] Feb 01 06:52:58 crc kubenswrapper[4546]: I0201 06:52:58.529972 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grjmt\" (UniqueName: \"kubernetes.io/projected/34c00d06-5f1e-4cb7-8f0d-487b6f864905-kube-api-access-grjmt\") pod \"openstack-operator-index-dbks6\" (UID: \"34c00d06-5f1e-4cb7-8f0d-487b6f864905\") " pod="openstack-operators/openstack-operator-index-dbks6" Feb 01 06:52:58 crc kubenswrapper[4546]: I0201 06:52:58.630632 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grjmt\" (UniqueName: \"kubernetes.io/projected/34c00d06-5f1e-4cb7-8f0d-487b6f864905-kube-api-access-grjmt\") pod \"openstack-operator-index-dbks6\" (UID: \"34c00d06-5f1e-4cb7-8f0d-487b6f864905\") " pod="openstack-operators/openstack-operator-index-dbks6" Feb 01 06:52:58 crc kubenswrapper[4546]: I0201 06:52:58.658982 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grjmt\" (UniqueName: \"kubernetes.io/projected/34c00d06-5f1e-4cb7-8f0d-487b6f864905-kube-api-access-grjmt\") pod \"openstack-operator-index-dbks6\" (UID: 
\"34c00d06-5f1e-4cb7-8f0d-487b6f864905\") " pod="openstack-operators/openstack-operator-index-dbks6" Feb 01 06:52:58 crc kubenswrapper[4546]: I0201 06:52:58.766085 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-fqfhw" podUID="98cee7d0-9a89-432c-9de6-3a29fe637cef" containerName="registry-server" containerID="cri-o://b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4" gracePeriod=2 Feb 01 06:52:58 crc kubenswrapper[4546]: I0201 06:52:58.786696 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dbks6" Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.231764 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fqfhw" Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.242238 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drcz4\" (UniqueName: \"kubernetes.io/projected/98cee7d0-9a89-432c-9de6-3a29fe637cef-kube-api-access-drcz4\") pod \"98cee7d0-9a89-432c-9de6-3a29fe637cef\" (UID: \"98cee7d0-9a89-432c-9de6-3a29fe637cef\") " Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.253326 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98cee7d0-9a89-432c-9de6-3a29fe637cef-kube-api-access-drcz4" (OuterVolumeSpecName: "kube-api-access-drcz4") pod "98cee7d0-9a89-432c-9de6-3a29fe637cef" (UID: "98cee7d0-9a89-432c-9de6-3a29fe637cef"). InnerVolumeSpecName "kube-api-access-drcz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.343420 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drcz4\" (UniqueName: \"kubernetes.io/projected/98cee7d0-9a89-432c-9de6-3a29fe637cef-kube-api-access-drcz4\") on node \"crc\" DevicePath \"\"" Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.348770 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dbks6"] Feb 01 06:52:59 crc kubenswrapper[4546]: W0201 06:52:59.351527 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34c00d06_5f1e_4cb7_8f0d_487b6f864905.slice/crio-817cbc4fd4c808ff693dbdb899786c3d24261227c3830c2a2f35bfe77c49e007 WatchSource:0}: Error finding container 817cbc4fd4c808ff693dbdb899786c3d24261227c3830c2a2f35bfe77c49e007: Status 404 returned error can't find the container with id 817cbc4fd4c808ff693dbdb899786c3d24261227c3830c2a2f35bfe77c49e007 Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.774330 4546 generic.go:334] "Generic (PLEG): container finished" podID="98cee7d0-9a89-432c-9de6-3a29fe637cef" containerID="b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4" exitCode=0 Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.774374 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fqfhw" event={"ID":"98cee7d0-9a89-432c-9de6-3a29fe637cef","Type":"ContainerDied","Data":"b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4"} Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.774406 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fqfhw" event={"ID":"98cee7d0-9a89-432c-9de6-3a29fe637cef","Type":"ContainerDied","Data":"9498bf6e63bd92ac9680a26b7be9c107af6cae1c3e3ae42fc7ece16671955add"} Feb 01 06:52:59 crc kubenswrapper[4546]: 
I0201 06:52:59.774359 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fqfhw" Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.774436 4546 scope.go:117] "RemoveContainer" containerID="b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4" Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.777145 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbks6" event={"ID":"34c00d06-5f1e-4cb7-8f0d-487b6f864905","Type":"ContainerStarted","Data":"817cbc4fd4c808ff693dbdb899786c3d24261227c3830c2a2f35bfe77c49e007"} Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.794498 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-fqfhw"] Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.794560 4546 scope.go:117] "RemoveContainer" containerID="b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4" Feb 01 06:52:59 crc kubenswrapper[4546]: E0201 06:52:59.795058 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4\": container with ID starting with b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4 not found: ID does not exist" containerID="b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4" Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.795095 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4"} err="failed to get container status \"b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4\": rpc error: code = NotFound desc = could not find container \"b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4\": container with ID starting with 
b242e0987355260aee4e7f8224a2a3ad25db7468741e641497230cdd319ea4f4 not found: ID does not exist" Feb 01 06:52:59 crc kubenswrapper[4546]: I0201 06:52:59.797187 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-fqfhw"] Feb 01 06:53:00 crc kubenswrapper[4546]: I0201 06:53:00.786897 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbks6" event={"ID":"34c00d06-5f1e-4cb7-8f0d-487b6f864905","Type":"ContainerStarted","Data":"618ff4ccba7188fbf7f7098aabf5d7aae2ac3b862efc3e7e1f6cad35100d7b39"} Feb 01 06:53:00 crc kubenswrapper[4546]: I0201 06:53:00.804833 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dbks6" podStartSLOduration=2.319033627 podStartE2EDuration="2.804812172s" podCreationTimestamp="2026-02-01 06:52:58 +0000 UTC" firstStartedPulling="2026-02-01 06:52:59.354911995 +0000 UTC m=+610.005848010" lastFinishedPulling="2026-02-01 06:52:59.840690539 +0000 UTC m=+610.491626555" observedRunningTime="2026-02-01 06:53:00.800598218 +0000 UTC m=+611.451534233" watchObservedRunningTime="2026-02-01 06:53:00.804812172 +0000 UTC m=+611.455748188" Feb 01 06:53:00 crc kubenswrapper[4546]: I0201 06:53:00.938741 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-gvwt6" Feb 01 06:53:01 crc kubenswrapper[4546]: I0201 06:53:01.596105 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wzqbx" Feb 01 06:53:01 crc kubenswrapper[4546]: I0201 06:53:01.660712 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98cee7d0-9a89-432c-9de6-3a29fe637cef" path="/var/lib/kubelet/pods/98cee7d0-9a89-432c-9de6-3a29fe637cef/volumes" Feb 01 06:53:01 crc kubenswrapper[4546]: I0201 06:53:01.680261 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/controller-6968d8fdc4-wvbs6" Feb 01 06:53:08 crc kubenswrapper[4546]: I0201 06:53:08.787792 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-dbks6" Feb 01 06:53:08 crc kubenswrapper[4546]: I0201 06:53:08.789152 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-dbks6" Feb 01 06:53:08 crc kubenswrapper[4546]: I0201 06:53:08.817899 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-dbks6" Feb 01 06:53:08 crc kubenswrapper[4546]: I0201 06:53:08.852072 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-dbks6" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.230249 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf"] Feb 01 06:53:15 crc kubenswrapper[4546]: E0201 06:53:15.231497 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cee7d0-9a89-432c-9de6-3a29fe637cef" containerName="registry-server" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.231521 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cee7d0-9a89-432c-9de6-3a29fe637cef" containerName="registry-server" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.231688 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="98cee7d0-9a89-432c-9de6-3a29fe637cef" containerName="registry-server" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.233051 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.235300 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-rj8ng" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.238026 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf"] Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.428556 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.428697 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssqwq\" (UniqueName: \"kubernetes.io/projected/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-kube-api-access-ssqwq\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.428816 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 
06:53:15.529837 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssqwq\" (UniqueName: \"kubernetes.io/projected/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-kube-api-access-ssqwq\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.529910 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.530004 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.530487 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.530570 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.549129 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssqwq\" (UniqueName: \"kubernetes.io/projected/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-kube-api-access-ssqwq\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.554838 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.766382 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf"] Feb 01 06:53:15 crc kubenswrapper[4546]: I0201 06:53:15.865500 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" event={"ID":"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927","Type":"ContainerStarted","Data":"1c1cb6cf5060d6ea6fa61ac4542b93df82490cd63aa9947cd8d2de75aca15911"} Feb 01 06:53:16 crc kubenswrapper[4546]: I0201 06:53:16.872914 4546 generic.go:334] "Generic (PLEG): container finished" podID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerID="727e6e8e862c9aa7e1490798081f67a677b1cc630e99bab5cacfd8fea2ff964e" exitCode=0 Feb 01 06:53:16 crc kubenswrapper[4546]: I0201 06:53:16.873014 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" event={"ID":"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927","Type":"ContainerDied","Data":"727e6e8e862c9aa7e1490798081f67a677b1cc630e99bab5cacfd8fea2ff964e"} Feb 01 06:53:18 crc kubenswrapper[4546]: I0201 06:53:18.886590 4546 generic.go:334] "Generic (PLEG): container finished" podID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerID="a7b735c9679802a20639c3370eeae6c0aeb72d15d583565e757c17929fca6ba7" exitCode=0 Feb 01 06:53:18 crc kubenswrapper[4546]: I0201 06:53:18.886643 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" event={"ID":"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927","Type":"ContainerDied","Data":"a7b735c9679802a20639c3370eeae6c0aeb72d15d583565e757c17929fca6ba7"} Feb 01 06:53:19 crc kubenswrapper[4546]: I0201 06:53:19.896029 4546 generic.go:334] "Generic (PLEG): container finished" podID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerID="4b9f8a51f55da27973528468f0e0da52bd19751e13dc79fbf0bc23e48f8db4b7" exitCode=0 Feb 01 06:53:19 crc kubenswrapper[4546]: I0201 06:53:19.896095 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" event={"ID":"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927","Type":"ContainerDied","Data":"4b9f8a51f55da27973528468f0e0da52bd19751e13dc79fbf0bc23e48f8db4b7"} Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.111286 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.308722 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-bundle\") pod \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.308844 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-util\") pod \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.309167 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssqwq\" (UniqueName: \"kubernetes.io/projected/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-kube-api-access-ssqwq\") pod \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\" (UID: \"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927\") " Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.309224 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-bundle" (OuterVolumeSpecName: "bundle") pod "1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" (UID: "1aec36e4-2b24-4d4f-8e85-f6ee37fb3927"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.309619 4546 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.314551 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-kube-api-access-ssqwq" (OuterVolumeSpecName: "kube-api-access-ssqwq") pod "1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" (UID: "1aec36e4-2b24-4d4f-8e85-f6ee37fb3927"). InnerVolumeSpecName "kube-api-access-ssqwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.411158 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssqwq\" (UniqueName: \"kubernetes.io/projected/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-kube-api-access-ssqwq\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.536203 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-util" (OuterVolumeSpecName: "util") pod "1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" (UID: "1aec36e4-2b24-4d4f-8e85-f6ee37fb3927"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.615022 4546 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1aec36e4-2b24-4d4f-8e85-f6ee37fb3927-util\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.920404 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" event={"ID":"1aec36e4-2b24-4d4f-8e85-f6ee37fb3927","Type":"ContainerDied","Data":"1c1cb6cf5060d6ea6fa61ac4542b93df82490cd63aa9947cd8d2de75aca15911"} Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.920448 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c1cb6cf5060d6ea6fa61ac4542b93df82490cd63aa9947cd8d2de75aca15911" Feb 01 06:53:21 crc kubenswrapper[4546]: I0201 06:53:21.920508 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1qccdf" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.246269 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d"] Feb 01 06:53:27 crc kubenswrapper[4546]: E0201 06:53:27.247101 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerName="pull" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.247117 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerName="pull" Feb 01 06:53:27 crc kubenswrapper[4546]: E0201 06:53:27.247128 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerName="util" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.247134 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerName="util" Feb 01 06:53:27 crc kubenswrapper[4546]: E0201 06:53:27.247144 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerName="extract" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.247150 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerName="extract" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.247282 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aec36e4-2b24-4d4f-8e85-f6ee37fb3927" containerName="extract" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.247741 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.250056 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-x8jvp" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.265592 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d"] Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.283129 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cthqp\" (UniqueName: \"kubernetes.io/projected/fa42398b-8afc-455a-8bbc-22884d951c8b-kube-api-access-cthqp\") pod \"openstack-operator-controller-init-757f46c65d-9n24d\" (UID: \"fa42398b-8afc-455a-8bbc-22884d951c8b\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.384155 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cthqp\" (UniqueName: 
\"kubernetes.io/projected/fa42398b-8afc-455a-8bbc-22884d951c8b-kube-api-access-cthqp\") pod \"openstack-operator-controller-init-757f46c65d-9n24d\" (UID: \"fa42398b-8afc-455a-8bbc-22884d951c8b\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.403323 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cthqp\" (UniqueName: \"kubernetes.io/projected/fa42398b-8afc-455a-8bbc-22884d951c8b-kube-api-access-cthqp\") pod \"openstack-operator-controller-init-757f46c65d-9n24d\" (UID: \"fa42398b-8afc-455a-8bbc-22884d951c8b\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.562293 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.807811 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d"] Feb 01 06:53:27 crc kubenswrapper[4546]: I0201 06:53:27.951783 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" event={"ID":"fa42398b-8afc-455a-8bbc-22884d951c8b","Type":"ContainerStarted","Data":"b8c2b81bc5cd6df891134762c33e5c072782264c21275e0633959c2f2bbcaff4"} Feb 01 06:53:34 crc kubenswrapper[4546]: I0201 06:53:34.005913 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" event={"ID":"fa42398b-8afc-455a-8bbc-22884d951c8b","Type":"ContainerStarted","Data":"0754079ebf92eb43ae41264075a89d8f7656617b38cc1e5829ed9f7b801cba07"} Feb 01 06:53:34 crc kubenswrapper[4546]: I0201 06:53:34.007729 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" Feb 01 06:53:34 crc kubenswrapper[4546]: I0201 06:53:34.033271 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" podStartSLOduration=1.490505719 podStartE2EDuration="7.033242686s" podCreationTimestamp="2026-02-01 06:53:27 +0000 UTC" firstStartedPulling="2026-02-01 06:53:27.820384218 +0000 UTC m=+638.471320235" lastFinishedPulling="2026-02-01 06:53:33.363121186 +0000 UTC m=+644.014057202" observedRunningTime="2026-02-01 06:53:34.027755812 +0000 UTC m=+644.678691828" watchObservedRunningTime="2026-02-01 06:53:34.033242686 +0000 UTC m=+644.684178702" Feb 01 06:53:47 crc kubenswrapper[4546]: I0201 06:53:47.565607 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-9n24d" Feb 01 06:54:14 crc kubenswrapper[4546]: I0201 06:54:14.998525 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.001210 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.012093 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.012620 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.015064 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4bz7j" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.018579 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-hh65p" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.031143 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.048990 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.049610 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.050955 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-lbwhz" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.054344 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.068555 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.069520 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.074193 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-c8krl" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.090788 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.091659 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.093214 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rxf2d" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.106619 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.109821 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.110816 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.112653 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.114230 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4ww5t" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.121033 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.128711 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-f524g"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.129287 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.133159 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kvvhc" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.133325 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.141578 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42vlw\" (UniqueName: \"kubernetes.io/projected/7d68804a-07fb-4472-b820-b4a573c6fa5e-kube-api-access-42vlw\") pod \"horizon-operator-controller-manager-5fb775575f-vbzg9\" (UID: \"7d68804a-07fb-4472-b820-b4a573c6fa5e\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.141615 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zxcj\" (UniqueName: \"kubernetes.io/projected/a4a51000-7d7d-4086-a484-bc1206e61efd-kube-api-access-4zxcj\") pod \"cinder-operator-controller-manager-8d874c8fc-nn6m6\" (UID: \"a4a51000-7d7d-4086-a484-bc1206e61efd\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.141665 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g42z\" (UniqueName: \"kubernetes.io/projected/f0af8317-a52f-4860-afb9-bf87fd8b5a9c-kube-api-access-8g42z\") pod \"glance-operator-controller-manager-8886f4c47-tfhhx\" (UID: \"f0af8317-a52f-4860-afb9-bf87fd8b5a9c\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.141683 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fslc\" (UniqueName: \"kubernetes.io/projected/1730fca3-3542-4276-9206-d786273fbbcf-kube-api-access-7fslc\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-tbjxg\" (UID: \"1730fca3-3542-4276-9206-d786273fbbcf\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.142912 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-f524g"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.163918 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.164650 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.167144 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-c7tv5" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.170910 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.171416 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.173004 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mmznt" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.176319 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.185332 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.229484 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.246520 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp4fj\" (UniqueName: \"kubernetes.io/projected/1320b9af-c5c3-4bbb-8cc2-b0b5d0a77200-kube-api-access-rp4fj\") pod \"heat-operator-controller-manager-69d6db494d-lhdxk\" (UID: \"1320b9af-c5c3-4bbb-8cc2-b0b5d0a77200\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.246661 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.246740 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c6d4g\" (UniqueName: \"kubernetes.io/projected/3b81f7fa-1d81-4157-bcda-fa30d8c904d0-kube-api-access-c6d4g\") pod \"designate-operator-controller-manager-6d9697b7f4-n5srj\" (UID: \"3b81f7fa-1d81-4157-bcda-fa30d8c904d0\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.246821 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g42z\" (UniqueName: \"kubernetes.io/projected/f0af8317-a52f-4860-afb9-bf87fd8b5a9c-kube-api-access-8g42z\") pod \"glance-operator-controller-manager-8886f4c47-tfhhx\" (UID: \"f0af8317-a52f-4860-afb9-bf87fd8b5a9c\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.246923 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fslc\" (UniqueName: \"kubernetes.io/projected/1730fca3-3542-4276-9206-d786273fbbcf-kube-api-access-7fslc\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-tbjxg\" (UID: \"1730fca3-3542-4276-9206-d786273fbbcf\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.247247 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tk56\" (UniqueName: \"kubernetes.io/projected/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-kube-api-access-4tk56\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.247506 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42vlw\" (UniqueName: \"kubernetes.io/projected/7d68804a-07fb-4472-b820-b4a573c6fa5e-kube-api-access-42vlw\") pod 
\"horizon-operator-controller-manager-5fb775575f-vbzg9\" (UID: \"7d68804a-07fb-4472-b820-b4a573c6fa5e\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.247588 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zxcj\" (UniqueName: \"kubernetes.io/projected/a4a51000-7d7d-4086-a484-bc1206e61efd-kube-api-access-4zxcj\") pod \"cinder-operator-controller-manager-8d874c8fc-nn6m6\" (UID: \"a4a51000-7d7d-4086-a484-bc1206e61efd\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.266113 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.267604 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.273914 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.274247 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-4422x" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.274777 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.283349 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-7tx4x" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.287556 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.304360 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.309184 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.313672 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42vlw\" (UniqueName: \"kubernetes.io/projected/7d68804a-07fb-4472-b820-b4a573c6fa5e-kube-api-access-42vlw\") pod \"horizon-operator-controller-manager-5fb775575f-vbzg9\" (UID: \"7d68804a-07fb-4472-b820-b4a573c6fa5e\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.316216 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-lclg8" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.316740 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fslc\" (UniqueName: \"kubernetes.io/projected/1730fca3-3542-4276-9206-d786273fbbcf-kube-api-access-7fslc\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-tbjxg\" (UID: \"1730fca3-3542-4276-9206-d786273fbbcf\") " 
pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.318905 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.322432 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g42z\" (UniqueName: \"kubernetes.io/projected/f0af8317-a52f-4860-afb9-bf87fd8b5a9c-kube-api-access-8g42z\") pod \"glance-operator-controller-manager-8886f4c47-tfhhx\" (UID: \"f0af8317-a52f-4860-afb9-bf87fd8b5a9c\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.323354 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.323732 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zxcj\" (UniqueName: \"kubernetes.io/projected/a4a51000-7d7d-4086-a484-bc1206e61efd-kube-api-access-4zxcj\") pod \"cinder-operator-controller-manager-8d874c8fc-nn6m6\" (UID: \"a4a51000-7d7d-4086-a484-bc1206e61efd\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.332095 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.332455 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.341821 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.342615 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.342694 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.353907 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.354642 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.356404 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp4fj\" (UniqueName: \"kubernetes.io/projected/1320b9af-c5c3-4bbb-8cc2-b0b5d0a77200-kube-api-access-rp4fj\") pod \"heat-operator-controller-manager-69d6db494d-lhdxk\" (UID: \"1320b9af-c5c3-4bbb-8cc2-b0b5d0a77200\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.356518 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.356586 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6d4g\" (UniqueName: \"kubernetes.io/projected/3b81f7fa-1d81-4157-bcda-fa30d8c904d0-kube-api-access-c6d4g\") pod \"designate-operator-controller-manager-6d9697b7f4-n5srj\" (UID: \"3b81f7fa-1d81-4157-bcda-fa30d8c904d0\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.356682 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h69hf\" (UniqueName: \"kubernetes.io/projected/7f01d442-49f8-4409-a42b-9fc3529b1913-kube-api-access-h69hf\") pod \"ironic-operator-controller-manager-5f4b8bd54d-c5wwk\" (UID: \"7f01d442-49f8-4409-a42b-9fc3529b1913\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.356751 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tk56\" (UniqueName: \"kubernetes.io/projected/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-kube-api-access-4tk56\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.356818 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrgws\" (UniqueName: \"kubernetes.io/projected/f6218b7d-d104-411c-9e35-f13fa9d7b381-kube-api-access-hrgws\") pod \"keystone-operator-controller-manager-84f48565d4-lplnz\" (UID: \"f6218b7d-d104-411c-9e35-f13fa9d7b381\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" Feb 01 06:54:15 crc kubenswrapper[4546]: E0201 06:54:15.357471 4546 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:15 crc kubenswrapper[4546]: E0201 06:54:15.365979 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert podName:cfdbf554-1ffa-45f1-a41f-7013fa78a1a7 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:15.865963736 +0000 UTC m=+686.516899753 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert") pod "infra-operator-controller-manager-79955696d6-f524g" (UID: "cfdbf554-1ffa-45f1-a41f-7013fa78a1a7") : secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.361715 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hqk77" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.361794 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ktgwz" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.383799 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.389393 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp4fj\" (UniqueName: \"kubernetes.io/projected/1320b9af-c5c3-4bbb-8cc2-b0b5d0a77200-kube-api-access-rp4fj\") pod \"heat-operator-controller-manager-69d6db494d-lhdxk\" (UID: \"1320b9af-c5c3-4bbb-8cc2-b0b5d0a77200\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.406008 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.410703 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tk56\" (UniqueName: \"kubernetes.io/projected/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-kube-api-access-4tk56\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.419988 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.420029 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.420817 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.425289 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6d4g\" (UniqueName: \"kubernetes.io/projected/3b81f7fa-1d81-4157-bcda-fa30d8c904d0-kube-api-access-c6d4g\") pod \"designate-operator-controller-manager-6d9697b7f4-n5srj\" (UID: \"3b81f7fa-1d81-4157-bcda-fa30d8c904d0\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.425506 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.434100 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.434939 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.435188 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qf6nr" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.448287 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5kh7b" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.448485 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.459985 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nshn8\" (UniqueName: \"kubernetes.io/projected/ac5fa267-3d2a-46be-a790-01591d89d61a-kube-api-access-nshn8\") pod \"manila-operator-controller-manager-7dd968899f-t5rzn\" (UID: \"ac5fa267-3d2a-46be-a790-01591d89d61a\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.460059 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljnk8\" (UniqueName: \"kubernetes.io/projected/bcc5200d-ffbe-4db8-9d56-3a0e880dafbb-kube-api-access-ljnk8\") pod \"neutron-operator-controller-manager-585dbc889-bqpf2\" (UID: 
\"bcc5200d-ffbe-4db8-9d56-3a0e880dafbb\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.460086 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h69hf\" (UniqueName: \"kubernetes.io/projected/7f01d442-49f8-4409-a42b-9fc3529b1913-kube-api-access-h69hf\") pod \"ironic-operator-controller-manager-5f4b8bd54d-c5wwk\" (UID: \"7f01d442-49f8-4409-a42b-9fc3529b1913\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.460144 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z87nt\" (UniqueName: \"kubernetes.io/projected/a9c99c4d-dd86-45a1-af29-a0583061329f-kube-api-access-z87nt\") pod \"ovn-operator-controller-manager-788c46999f-kgqrr\" (UID: \"a9c99c4d-dd86-45a1-af29-a0583061329f\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.460166 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc5r8\" (UniqueName: \"kubernetes.io/projected/1f28aee6-f110-4f42-a043-973026e50931-kube-api-access-dc5r8\") pod \"nova-operator-controller-manager-55bff696bd-jfh9x\" (UID: \"1f28aee6-f110-4f42-a043-973026e50931\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.460215 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrgws\" (UniqueName: \"kubernetes.io/projected/f6218b7d-d104-411c-9e35-f13fa9d7b381-kube-api-access-hrgws\") pod \"keystone-operator-controller-manager-84f48565d4-lplnz\" (UID: \"f6218b7d-d104-411c-9e35-f13fa9d7b381\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" Feb 01 
06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.460301 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6stfs\" (UniqueName: \"kubernetes.io/projected/7cb67046-56b1-4ee1-9934-011f5b399566-kube-api-access-6stfs\") pod \"octavia-operator-controller-manager-6687f8d877-czc5g\" (UID: \"7cb67046-56b1-4ee1-9934-011f5b399566\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.460326 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq9td\" (UniqueName: \"kubernetes.io/projected/ea2714a7-74c2-486c-963f-015bd55b7f3b-kube-api-access-pq9td\") pod \"mariadb-operator-controller-manager-67bf948998-zrbx8\" (UID: \"ea2714a7-74c2-486c-963f-015bd55b7f3b\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.484425 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h69hf\" (UniqueName: \"kubernetes.io/projected/7f01d442-49f8-4409-a42b-9fc3529b1913-kube-api-access-h69hf\") pod \"ironic-operator-controller-manager-5f4b8bd54d-c5wwk\" (UID: \"7f01d442-49f8-4409-a42b-9fc3529b1913\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.497296 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrgws\" (UniqueName: \"kubernetes.io/projected/f6218b7d-d104-411c-9e35-f13fa9d7b381-kube-api-access-hrgws\") pod \"keystone-operator-controller-manager-84f48565d4-lplnz\" (UID: \"f6218b7d-d104-411c-9e35-f13fa9d7b381\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.503141 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.563147 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.566864 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dllv\" (UniqueName: \"kubernetes.io/projected/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-kube-api-access-6dllv\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.566913 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z87nt\" (UniqueName: \"kubernetes.io/projected/a9c99c4d-dd86-45a1-af29-a0583061329f-kube-api-access-z87nt\") pod \"ovn-operator-controller-manager-788c46999f-kgqrr\" (UID: \"a9c99c4d-dd86-45a1-af29-a0583061329f\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.566948 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.566966 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc5r8\" (UniqueName: \"kubernetes.io/projected/1f28aee6-f110-4f42-a043-973026e50931-kube-api-access-dc5r8\") pod 
\"nova-operator-controller-manager-55bff696bd-jfh9x\" (UID: \"1f28aee6-f110-4f42-a043-973026e50931\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.567050 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6stfs\" (UniqueName: \"kubernetes.io/projected/7cb67046-56b1-4ee1-9934-011f5b399566-kube-api-access-6stfs\") pod \"octavia-operator-controller-manager-6687f8d877-czc5g\" (UID: \"7cb67046-56b1-4ee1-9934-011f5b399566\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.567082 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq9td\" (UniqueName: \"kubernetes.io/projected/ea2714a7-74c2-486c-963f-015bd55b7f3b-kube-api-access-pq9td\") pod \"mariadb-operator-controller-manager-67bf948998-zrbx8\" (UID: \"ea2714a7-74c2-486c-963f-015bd55b7f3b\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.567181 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nshn8\" (UniqueName: \"kubernetes.io/projected/ac5fa267-3d2a-46be-a790-01591d89d61a-kube-api-access-nshn8\") pod \"manila-operator-controller-manager-7dd968899f-t5rzn\" (UID: \"ac5fa267-3d2a-46be-a790-01591d89d61a\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.567232 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljnk8\" (UniqueName: \"kubernetes.io/projected/bcc5200d-ffbe-4db8-9d56-3a0e880dafbb-kube-api-access-ljnk8\") pod \"neutron-operator-controller-manager-585dbc889-bqpf2\" (UID: \"bcc5200d-ffbe-4db8-9d56-3a0e880dafbb\") " 
pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.576768 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.577558 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.593439 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gsrgr" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.598752 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.599840 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc5r8\" (UniqueName: \"kubernetes.io/projected/1f28aee6-f110-4f42-a043-973026e50931-kube-api-access-dc5r8\") pod \"nova-operator-controller-manager-55bff696bd-jfh9x\" (UID: \"1f28aee6-f110-4f42-a043-973026e50931\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.600895 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljnk8\" (UniqueName: \"kubernetes.io/projected/bcc5200d-ffbe-4db8-9d56-3a0e880dafbb-kube-api-access-ljnk8\") pod \"neutron-operator-controller-manager-585dbc889-bqpf2\" (UID: \"bcc5200d-ffbe-4db8-9d56-3a0e880dafbb\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.620980 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nshn8\" (UniqueName: 
\"kubernetes.io/projected/ac5fa267-3d2a-46be-a790-01591d89d61a-kube-api-access-nshn8\") pod \"manila-operator-controller-manager-7dd968899f-t5rzn\" (UID: \"ac5fa267-3d2a-46be-a790-01591d89d61a\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.621437 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6stfs\" (UniqueName: \"kubernetes.io/projected/7cb67046-56b1-4ee1-9934-011f5b399566-kube-api-access-6stfs\") pod \"octavia-operator-controller-manager-6687f8d877-czc5g\" (UID: \"7cb67046-56b1-4ee1-9934-011f5b399566\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.623081 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.623818 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.634206 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-wmgk7" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.636377 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq9td\" (UniqueName: \"kubernetes.io/projected/ea2714a7-74c2-486c-963f-015bd55b7f3b-kube-api-access-pq9td\") pod \"mariadb-operator-controller-manager-67bf948998-zrbx8\" (UID: \"ea2714a7-74c2-486c-963f-015bd55b7f3b\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.637383 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z87nt\" (UniqueName: \"kubernetes.io/projected/a9c99c4d-dd86-45a1-af29-a0583061329f-kube-api-access-z87nt\") pod \"ovn-operator-controller-manager-788c46999f-kgqrr\" (UID: \"a9c99c4d-dd86-45a1-af29-a0583061329f\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.660067 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.675563 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srq2x\" (UniqueName: \"kubernetes.io/projected/230e3dd1-d069-4350-abb2-612a85465ea1-kube-api-access-srq2x\") pod \"placement-operator-controller-manager-5b964cf4cd-zddkx\" (UID: \"230e3dd1-d069-4350-abb2-612a85465ea1\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.675599 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whr72\" (UniqueName: \"kubernetes.io/projected/75a1fdba-e747-4bd1-8ef0-0afab1e16130-kube-api-access-whr72\") pod \"swift-operator-controller-manager-68fc8c869-gz6jj\" (UID: \"75a1fdba-e747-4bd1-8ef0-0afab1e16130\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.675633 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dllv\" (UniqueName: \"kubernetes.io/projected/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-kube-api-access-6dllv\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.675655 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 
06:54:15 crc kubenswrapper[4546]: E0201 06:54:15.675828 4546 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:15 crc kubenswrapper[4546]: E0201 06:54:15.675891 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert podName:1b7a6296-7403-4384-ad36-8cc2baf9bcc4 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:16.175877548 +0000 UTC m=+686.826813564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" (UID: "1b7a6296-7403-4384-ad36-8cc2baf9bcc4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.692139 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.710457 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dllv\" (UniqueName: \"kubernetes.io/projected/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-kube-api-access-6dllv\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.741064 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.745123 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.749933 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.762718 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.784501 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.793729 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.797223 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.797259 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.797573 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.804493 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.807023 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.814484 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jb8wb" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.815819 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9kckz" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.823339 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.827970 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.834788 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srq2x\" (UniqueName: \"kubernetes.io/projected/230e3dd1-d069-4350-abb2-612a85465ea1-kube-api-access-srq2x\") pod \"placement-operator-controller-manager-5b964cf4cd-zddkx\" (UID: \"230e3dd1-d069-4350-abb2-612a85465ea1\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.837611 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whr72\" (UniqueName: \"kubernetes.io/projected/75a1fdba-e747-4bd1-8ef0-0afab1e16130-kube-api-access-whr72\") pod \"swift-operator-controller-manager-68fc8c869-gz6jj\" (UID: \"75a1fdba-e747-4bd1-8ef0-0afab1e16130\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.876662 4546 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-564965969-cm5ll"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.877916 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.880686 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-85j98" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.886131 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srq2x\" (UniqueName: \"kubernetes.io/projected/230e3dd1-d069-4350-abb2-612a85465ea1-kube-api-access-srq2x\") pod \"placement-operator-controller-manager-5b964cf4cd-zddkx\" (UID: \"230e3dd1-d069-4350-abb2-612a85465ea1\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.902612 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.906081 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-cm5ll"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.916524 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whr72\" (UniqueName: \"kubernetes.io/projected/75a1fdba-e747-4bd1-8ef0-0afab1e16130-kube-api-access-whr72\") pod \"swift-operator-controller-manager-68fc8c869-gz6jj\" (UID: \"75a1fdba-e747-4bd1-8ef0-0afab1e16130\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.935058 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.945186 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.945290 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrtnk\" (UniqueName: \"kubernetes.io/projected/54ff073b-b484-4f60-a66c-527d1c2eebe2-kube-api-access-qrtnk\") pod \"test-operator-controller-manager-56f8bfcd9f-s5zjw\" (UID: \"54ff073b-b484-4f60-a66c-527d1c2eebe2\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.945360 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6m5h\" (UniqueName: \"kubernetes.io/projected/013e345a-5f1a-41ac-8a6b-c33a35294181-kube-api-access-n6m5h\") pod \"telemetry-operator-controller-manager-64b5b76f97-7n4l2\" (UID: \"013e345a-5f1a-41ac-8a6b-c33a35294181\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" Feb 01 06:54:15 crc kubenswrapper[4546]: E0201 06:54:15.945758 4546 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:15 crc kubenswrapper[4546]: E0201 06:54:15.945812 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert podName:cfdbf554-1ffa-45f1-a41f-7013fa78a1a7 nodeName:}" failed. 
No retries permitted until 2026-02-01 06:54:16.945797269 +0000 UTC m=+687.596733274 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert") pod "infra-operator-controller-manager-79955696d6-f524g" (UID: "cfdbf554-1ffa-45f1-a41f-7013fa78a1a7") : secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.966639 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln"] Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.968633 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.972303 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qrdp7" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.972490 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 01 06:54:15 crc kubenswrapper[4546]: I0201 06:54:15.972598 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.000595 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.001944 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.030589 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.031428 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.035573 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-76ptw" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.038502 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.047083 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrtnk\" (UniqueName: \"kubernetes.io/projected/54ff073b-b484-4f60-a66c-527d1c2eebe2-kube-api-access-qrtnk\") pod \"test-operator-controller-manager-56f8bfcd9f-s5zjw\" (UID: \"54ff073b-b484-4f60-a66c-527d1c2eebe2\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.047183 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tjcx\" (UniqueName: \"kubernetes.io/projected/80be6ca8-0db1-4bd1-948c-44947f6f737f-kube-api-access-7tjcx\") pod \"watcher-operator-controller-manager-564965969-cm5ll\" (UID: \"80be6ca8-0db1-4bd1-948c-44947f6f737f\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.047323 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n6m5h\" (UniqueName: \"kubernetes.io/projected/013e345a-5f1a-41ac-8a6b-c33a35294181-kube-api-access-n6m5h\") pod \"telemetry-operator-controller-manager-64b5b76f97-7n4l2\" (UID: \"013e345a-5f1a-41ac-8a6b-c33a35294181\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.066326 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6m5h\" (UniqueName: \"kubernetes.io/projected/013e345a-5f1a-41ac-8a6b-c33a35294181-kube-api-access-n6m5h\") pod \"telemetry-operator-controller-manager-64b5b76f97-7n4l2\" (UID: \"013e345a-5f1a-41ac-8a6b-c33a35294181\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.076419 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrtnk\" (UniqueName: \"kubernetes.io/projected/54ff073b-b484-4f60-a66c-527d1c2eebe2-kube-api-access-qrtnk\") pod \"test-operator-controller-manager-56f8bfcd9f-s5zjw\" (UID: \"54ff073b-b484-4f60-a66c-527d1c2eebe2\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.119416 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.149161 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tjcx\" (UniqueName: \"kubernetes.io/projected/80be6ca8-0db1-4bd1-948c-44947f6f737f-kube-api-access-7tjcx\") pod \"watcher-operator-controller-manager-564965969-cm5ll\" (UID: \"80be6ca8-0db1-4bd1-948c-44947f6f737f\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.149515 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.149731 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrhs\" (UniqueName: \"kubernetes.io/projected/32163f3e-1db1-4e14-97f4-7341dfe224f6-kube-api-access-7qrhs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zh87z\" (UID: \"32163f3e-1db1-4e14-97f4-7341dfe224f6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.149833 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vhrh\" (UniqueName: \"kubernetes.io/projected/f165b121-cc85-4b6d-90a9-841971062150-kube-api-access-7vhrh\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.149924 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.167722 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tjcx\" (UniqueName: 
\"kubernetes.io/projected/80be6ca8-0db1-4bd1-948c-44947f6f737f-kube-api-access-7tjcx\") pod \"watcher-operator-controller-manager-564965969-cm5ll\" (UID: \"80be6ca8-0db1-4bd1-948c-44947f6f737f\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.183126 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.192062 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.219728 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.224582 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.251419 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.251573 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " 
pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.251614 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrhs\" (UniqueName: \"kubernetes.io/projected/32163f3e-1db1-4e14-97f4-7341dfe224f6-kube-api-access-7qrhs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zh87z\" (UID: \"32163f3e-1db1-4e14-97f4-7341dfe224f6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.251652 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vhrh\" (UniqueName: \"kubernetes.io/projected/f165b121-cc85-4b6d-90a9-841971062150-kube-api-access-7vhrh\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.251688 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.251836 4546 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.251937 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. 
No retries permitted until 2026-02-01 06:54:16.751919093 +0000 UTC m=+687.402855109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "webhook-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.252337 4546 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.252365 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert podName:1b7a6296-7403-4384-ad36-8cc2baf9bcc4 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:17.252354553 +0000 UTC m=+687.903290560 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" (UID: "1b7a6296-7403-4384-ad36-8cc2baf9bcc4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.252399 4546 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.252428 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:16.752421929 +0000 UTC m=+687.403357936 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "metrics-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.278195 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vhrh\" (UniqueName: \"kubernetes.io/projected/f165b121-cc85-4b6d-90a9-841971062150-kube-api-access-7vhrh\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.280302 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrhs\" (UniqueName: \"kubernetes.io/projected/32163f3e-1db1-4e14-97f4-7341dfe224f6-kube-api-access-7qrhs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zh87z\" (UID: \"32163f3e-1db1-4e14-97f4-7341dfe224f6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.324834 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" event={"ID":"1730fca3-3542-4276-9206-d786273fbbcf","Type":"ContainerStarted","Data":"cc713d598bd2eaa34d5cd29714e99ae1f5f6dd2980e873e1870d4324d2aa99c2"} Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.358000 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.403480 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.431906 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.436094 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx"] Feb 01 06:54:16 crc kubenswrapper[4546]: W0201 06:54:16.446016 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1320b9af_c5c3_4bbb_8cc2_b0b5d0a77200.slice/crio-fea766aa0a0694433c74725403d8a9d81a1a8ebc5162976788e62b8bf88e38fd WatchSource:0}: Error finding container fea766aa0a0694433c74725403d8a9d81a1a8ebc5162976788e62b8bf88e38fd: Status 404 returned error can't find the container with id fea766aa0a0694433c74725403d8a9d81a1a8ebc5162976788e62b8bf88e38fd Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.764056 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.764135 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: 
\"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.764293 4546 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.764409 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:17.764382629 +0000 UTC m=+688.415318645 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "metrics-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.764320 4546 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.764944 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:17.764931303 +0000 UTC m=+688.415867319 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "webhook-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.840674 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.855360 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.865651 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.918193 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x"] Feb 01 06:54:16 crc kubenswrapper[4546]: W0201 06:54:16.918443 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f01d442_49f8_4409_a42b_9fc3529b1913.slice/crio-1b1eea3f7e5475ee9177e13d1fe5208524af2d9994d92e46a80998d0a2257afe WatchSource:0}: Error finding container 1b1eea3f7e5475ee9177e13d1fe5208524af2d9994d92e46a80998d0a2257afe: Status 404 returned error can't find the container with id 1b1eea3f7e5475ee9177e13d1fe5208524af2d9994d92e46a80998d0a2257afe Feb 01 06:54:16 crc kubenswrapper[4546]: W0201 06:54:16.930061 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6218b7d_d104_411c_9e35_f13fa9d7b381.slice/crio-37459b205511f09dde922bffebfc4964b50eb09aae5e612ec7f6315a2e661e82 WatchSource:0}: Error finding container 
37459b205511f09dde922bffebfc4964b50eb09aae5e612ec7f6315a2e661e82: Status 404 returned error can't find the container with id 37459b205511f09dde922bffebfc4964b50eb09aae5e612ec7f6315a2e661e82 Feb 01 06:54:16 crc kubenswrapper[4546]: W0201 06:54:16.930735 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc5200d_ffbe_4db8_9d56_3a0e880dafbb.slice/crio-b3a595514a9be2eede732614b5dc4b35451d00ab8768b206e948d46f48e4ecbb WatchSource:0}: Error finding container b3a595514a9be2eede732614b5dc4b35451d00ab8768b206e948d46f48e4ecbb: Status 404 returned error can't find the container with id b3a595514a9be2eede732614b5dc4b35451d00ab8768b206e948d46f48e4ecbb Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.932665 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.940789 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.947121 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2"] Feb 01 06:54:16 crc kubenswrapper[4546]: I0201 06:54:16.968608 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:16 crc kubenswrapper[4546]: E0201 06:54:16.968776 4546 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:16 crc kubenswrapper[4546]: 
E0201 06:54:16.968844 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert podName:cfdbf554-1ffa-45f1-a41f-7013fa78a1a7 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:18.968825453 +0000 UTC m=+689.619761469 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert") pod "infra-operator-controller-manager-79955696d6-f524g" (UID: "cfdbf554-1ffa-45f1-a41f-7013fa78a1a7") : secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.020561 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw"] Feb 01 06:54:17 crc kubenswrapper[4546]: W0201 06:54:17.023329 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54ff073b_b484_4f60_a66c_527d1c2eebe2.slice/crio-6f95986e644e1da84e288586c1b68d655dfc245a0e54902a0845a021513899c2 WatchSource:0}: Error finding container 6f95986e644e1da84e288586c1b68d655dfc245a0e54902a0845a021513899c2: Status 404 returned error can't find the container with id 6f95986e644e1da84e288586c1b68d655dfc245a0e54902a0845a021513899c2 Feb 01 06:54:17 crc kubenswrapper[4546]: W0201 06:54:17.026514 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75a1fdba_e747_4bd1_8ef0_0afab1e16130.slice/crio-3025cc21a45d80baefcb2c2304f800332c90589d07a782d8215c6e479f16aa19 WatchSource:0}: Error finding container 3025cc21a45d80baefcb2c2304f800332c90589d07a782d8215c6e479f16aa19: Status 404 returned error can't find the container with id 3025cc21a45d80baefcb2c2304f800332c90589d07a782d8215c6e479f16aa19 Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.029179 4546 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj"] Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.037678 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2"] Feb 01 06:54:17 crc kubenswrapper[4546]: W0201 06:54:17.040126 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod013e345a_5f1a_41ac_8a6b_c33a35294181.slice/crio-339c8022d87810b7e5e413d1e667ff7cfeea11f69d4256ce4f3cab12f65c427d WatchSource:0}: Error finding container 339c8022d87810b7e5e413d1e667ff7cfeea11f69d4256ce4f3cab12f65c427d: Status 404 returned error can't find the container with id 339c8022d87810b7e5e413d1e667ff7cfeea11f69d4256ce4f3cab12f65c427d Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.043235 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n6m5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-64b5b76f97-7n4l2_openstack-operators(013e345a-5f1a-41ac-8a6b-c33a35294181): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.044584 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" podUID="013e345a-5f1a-41ac-8a6b-c33a35294181" Feb 01 06:54:17 crc 
kubenswrapper[4546]: I0201 06:54:17.057484 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx"] Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.064904 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr"] Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.074301 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-cm5ll"] Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.079468 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z"] Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.085022 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-srq2x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-zddkx_openstack-operators(230e3dd1-d069-4350-abb2-612a85465ea1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.085781 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn"] Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.086395 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7tjcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-cm5ll_openstack-operators(80be6ca8-0db1-4bd1-948c-44947f6f737f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.086424 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" podUID="230e3dd1-d069-4350-abb2-612a85465ea1" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.088485 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" podUID="80be6ca8-0db1-4bd1-948c-44947f6f737f" Feb 01 06:54:17 crc kubenswrapper[4546]: W0201 06:54:17.091479 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32163f3e_1db1_4e14_97f4_7341dfe224f6.slice/crio-c0d994006cc76451d8a302e351b85cddcfbe98e37a6eea2f810d62ac001f72de WatchSource:0}: Error finding container 
c0d994006cc76451d8a302e351b85cddcfbe98e37a6eea2f810d62ac001f72de: Status 404 returned error can't find the container with id c0d994006cc76451d8a302e351b85cddcfbe98e37a6eea2f810d62ac001f72de Feb 01 06:54:17 crc kubenswrapper[4546]: W0201 06:54:17.092156 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9c99c4d_dd86_45a1_af29_a0583061329f.slice/crio-9dd2af45b04b27ba909cb9620ae9059f645b54cb9006874dd030b0c7cd8b9aa6 WatchSource:0}: Error finding container 9dd2af45b04b27ba909cb9620ae9059f645b54cb9006874dd030b0c7cd8b9aa6: Status 404 returned error can't find the container with id 9dd2af45b04b27ba909cb9620ae9059f645b54cb9006874dd030b0c7cd8b9aa6 Feb 01 06:54:17 crc kubenswrapper[4546]: W0201 06:54:17.095934 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5fa267_3d2a_46be_a790_01591d89d61a.slice/crio-95f8439fcdea0360dd49cf8521194fd5e819a7d5f936a5b0ca39a011fa4cf318 WatchSource:0}: Error finding container 95f8439fcdea0360dd49cf8521194fd5e819a7d5f936a5b0ca39a011fa4cf318: Status 404 returned error can't find the container with id 95f8439fcdea0360dd49cf8521194fd5e819a7d5f936a5b0ca39a011fa4cf318 Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.097495 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7qrhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-zh87z_openstack-operators(32163f3e-1db1-4e14-97f4-7341dfe224f6): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.097997 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z87nt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-788c46999f-kgqrr_openstack-operators(a9c99c4d-dd86-45a1-af29-a0583061329f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.098509 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nshn8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7dd968899f-t5rzn_openstack-operators(ac5fa267-3d2a-46be-a790-01591d89d61a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.098656 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" podUID="32163f3e-1db1-4e14-97f4-7341dfe224f6" Feb 01 06:54:17 crc 
kubenswrapper[4546]: E0201 06:54:17.099568 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" podUID="a9c99c4d-dd86-45a1-af29-a0583061329f" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.099619 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" podUID="ac5fa267-3d2a-46be-a790-01591d89d61a" Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.272323 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.272520 4546 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.272568 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert podName:1b7a6296-7403-4384-ad36-8cc2baf9bcc4 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:19.272553644 +0000 UTC m=+689.923489660 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" (UID: "1b7a6296-7403-4384-ad36-8cc2baf9bcc4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.333582 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" event={"ID":"7d68804a-07fb-4472-b820-b4a573c6fa5e","Type":"ContainerStarted","Data":"98b45ad4ea2d83050e067821afdf70778407191560cce57b5ca9875fc2c8f492"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.334789 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" event={"ID":"013e345a-5f1a-41ac-8a6b-c33a35294181","Type":"ContainerStarted","Data":"339c8022d87810b7e5e413d1e667ff7cfeea11f69d4256ce4f3cab12f65c427d"} Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.335953 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" podUID="013e345a-5f1a-41ac-8a6b-c33a35294181" Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.336693 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" event={"ID":"7cb67046-56b1-4ee1-9934-011f5b399566","Type":"ContainerStarted","Data":"c56cd77ceac181150f4b163840007c919edbc4a02f1f563e328e250c874a08be"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.338426 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" event={"ID":"1f28aee6-f110-4f42-a043-973026e50931","Type":"ContainerStarted","Data":"3ed0d91480cfe53010a313ac471ecd7d88b7743a0eeaf89dfe9353fe8ce57083"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.341039 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" event={"ID":"f6218b7d-d104-411c-9e35-f13fa9d7b381","Type":"ContainerStarted","Data":"37459b205511f09dde922bffebfc4964b50eb09aae5e612ec7f6315a2e661e82"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.342620 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" event={"ID":"1320b9af-c5c3-4bbb-8cc2-b0b5d0a77200","Type":"ContainerStarted","Data":"fea766aa0a0694433c74725403d8a9d81a1a8ebc5162976788e62b8bf88e38fd"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.344283 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" event={"ID":"a9c99c4d-dd86-45a1-af29-a0583061329f","Type":"ContainerStarted","Data":"9dd2af45b04b27ba909cb9620ae9059f645b54cb9006874dd030b0c7cd8b9aa6"} Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.345338 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" podUID="a9c99c4d-dd86-45a1-af29-a0583061329f" Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.348072 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" 
event={"ID":"ea2714a7-74c2-486c-963f-015bd55b7f3b","Type":"ContainerStarted","Data":"6ba8e8f4268ec77454091f40a05b41b1f3d98f61549b03ef65e4e335496a3e52"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.350559 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" event={"ID":"ac5fa267-3d2a-46be-a790-01591d89d61a","Type":"ContainerStarted","Data":"95f8439fcdea0360dd49cf8521194fd5e819a7d5f936a5b0ca39a011fa4cf318"} Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.352318 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" podUID="ac5fa267-3d2a-46be-a790-01591d89d61a" Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.355809 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" event={"ID":"32163f3e-1db1-4e14-97f4-7341dfe224f6","Type":"ContainerStarted","Data":"c0d994006cc76451d8a302e351b85cddcfbe98e37a6eea2f810d62ac001f72de"} Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.357020 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" podUID="32163f3e-1db1-4e14-97f4-7341dfe224f6" Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.357531 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" 
event={"ID":"230e3dd1-d069-4350-abb2-612a85465ea1","Type":"ContainerStarted","Data":"0b52a96da89d711d7af6a81867fccc3dcb8342ad00bcc41e9e949c3d74ef1a4c"} Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.361074 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" podUID="230e3dd1-d069-4350-abb2-612a85465ea1" Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.363435 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" event={"ID":"54ff073b-b484-4f60-a66c-527d1c2eebe2","Type":"ContainerStarted","Data":"6f95986e644e1da84e288586c1b68d655dfc245a0e54902a0845a021513899c2"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.367320 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" event={"ID":"80be6ca8-0db1-4bd1-948c-44947f6f737f","Type":"ContainerStarted","Data":"8913b9973f115640341d00ce2a1ffcb0e47b30362d2f501854c6e3d4855e1d49"} Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.369752 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" podUID="80be6ca8-0db1-4bd1-948c-44947f6f737f" Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.372989 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" 
event={"ID":"a4a51000-7d7d-4086-a484-bc1206e61efd","Type":"ContainerStarted","Data":"e36b22d22ce97044228335126a9d83f134e092f8d38d99ce3b2638235421236a"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.374385 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" event={"ID":"75a1fdba-e747-4bd1-8ef0-0afab1e16130","Type":"ContainerStarted","Data":"3025cc21a45d80baefcb2c2304f800332c90589d07a782d8215c6e479f16aa19"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.379387 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" event={"ID":"f0af8317-a52f-4860-afb9-bf87fd8b5a9c","Type":"ContainerStarted","Data":"81e5f0e7c6ad11eab33bd7b1c73632c950c12d59ffe66329067215395c094618"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.381968 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" event={"ID":"bcc5200d-ffbe-4db8-9d56-3a0e880dafbb","Type":"ContainerStarted","Data":"b3a595514a9be2eede732614b5dc4b35451d00ab8768b206e948d46f48e4ecbb"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.383425 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" event={"ID":"7f01d442-49f8-4409-a42b-9fc3529b1913","Type":"ContainerStarted","Data":"1b1eea3f7e5475ee9177e13d1fe5208524af2d9994d92e46a80998d0a2257afe"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.386588 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" event={"ID":"3b81f7fa-1d81-4157-bcda-fa30d8c904d0","Type":"ContainerStarted","Data":"dcbbff5b4a08ae064d0ece114c7704b1c1f4c8d76c7c5f26219d08fd2f46e8f1"} Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.781119 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.782351 4546 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.782473 4546 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.782495 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:19.782455021 +0000 UTC m=+690.433391037 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "webhook-server-cert" not found Feb 01 06:54:17 crc kubenswrapper[4546]: I0201 06:54:17.782069 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:17 crc kubenswrapper[4546]: E0201 06:54:17.783164 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:19.783146454 +0000 UTC m=+690.434082471 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "metrics-server-cert" not found Feb 01 06:54:18 crc kubenswrapper[4546]: E0201 06:54:18.394218 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" podUID="ac5fa267-3d2a-46be-a790-01591d89d61a" Feb 01 06:54:18 crc kubenswrapper[4546]: E0201 06:54:18.394913 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" podUID="32163f3e-1db1-4e14-97f4-7341dfe224f6" Feb 01 06:54:18 crc kubenswrapper[4546]: E0201 06:54:18.394969 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" podUID="013e345a-5f1a-41ac-8a6b-c33a35294181" Feb 01 06:54:18 crc kubenswrapper[4546]: E0201 06:54:18.397069 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" podUID="80be6ca8-0db1-4bd1-948c-44947f6f737f" Feb 01 06:54:18 crc kubenswrapper[4546]: E0201 06:54:18.416567 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" podUID="230e3dd1-d069-4350-abb2-612a85465ea1" Feb 01 06:54:18 crc kubenswrapper[4546]: E0201 06:54:18.441714 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" podUID="a9c99c4d-dd86-45a1-af29-a0583061329f" Feb 01 06:54:19 crc kubenswrapper[4546]: I0201 06:54:19.004957 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:19 crc kubenswrapper[4546]: E0201 06:54:19.005098 4546 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:19 crc kubenswrapper[4546]: E0201 06:54:19.005140 4546 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert podName:cfdbf554-1ffa-45f1-a41f-7013fa78a1a7 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:23.005127774 +0000 UTC m=+693.656063781 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert") pod "infra-operator-controller-manager-79955696d6-f524g" (UID: "cfdbf554-1ffa-45f1-a41f-7013fa78a1a7") : secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:19 crc kubenswrapper[4546]: I0201 06:54:19.309802 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:19 crc kubenswrapper[4546]: E0201 06:54:19.310240 4546 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:19 crc kubenswrapper[4546]: E0201 06:54:19.310288 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert podName:1b7a6296-7403-4384-ad36-8cc2baf9bcc4 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:23.310276553 +0000 UTC m=+693.961212570 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" (UID: "1b7a6296-7403-4384-ad36-8cc2baf9bcc4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:19 crc kubenswrapper[4546]: I0201 06:54:19.817670 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:19 crc kubenswrapper[4546]: I0201 06:54:19.817829 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:19 crc kubenswrapper[4546]: E0201 06:54:19.818763 4546 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 06:54:19 crc kubenswrapper[4546]: E0201 06:54:19.818812 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:23.818800775 +0000 UTC m=+694.469736791 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "webhook-server-cert" not found Feb 01 06:54:19 crc kubenswrapper[4546]: E0201 06:54:19.819412 4546 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 06:54:19 crc kubenswrapper[4546]: E0201 06:54:19.819710 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:23.81969518 +0000 UTC m=+694.470631196 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "metrics-server-cert" not found Feb 01 06:54:23 crc kubenswrapper[4546]: I0201 06:54:23.077624 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:23 crc kubenswrapper[4546]: E0201 06:54:23.077919 4546 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:23 crc kubenswrapper[4546]: E0201 06:54:23.078016 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert 
podName:cfdbf554-1ffa-45f1-a41f-7013fa78a1a7 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:31.077985133 +0000 UTC m=+701.728921150 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert") pod "infra-operator-controller-manager-79955696d6-f524g" (UID: "cfdbf554-1ffa-45f1-a41f-7013fa78a1a7") : secret "infra-operator-webhook-server-cert" not found Feb 01 06:54:23 crc kubenswrapper[4546]: I0201 06:54:23.383376 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:23 crc kubenswrapper[4546]: E0201 06:54:23.383655 4546 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:23 crc kubenswrapper[4546]: E0201 06:54:23.383754 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert podName:1b7a6296-7403-4384-ad36-8cc2baf9bcc4 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:31.38373191 +0000 UTC m=+702.034667927 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" (UID: "1b7a6296-7403-4384-ad36-8cc2baf9bcc4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:23 crc kubenswrapper[4546]: I0201 06:54:23.891120 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:23 crc kubenswrapper[4546]: I0201 06:54:23.891213 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:23 crc kubenswrapper[4546]: E0201 06:54:23.891291 4546 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 06:54:23 crc kubenswrapper[4546]: E0201 06:54:23.891304 4546 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 06:54:23 crc kubenswrapper[4546]: E0201 06:54:23.891343 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:31.891328142 +0000 UTC m=+702.542264168 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "webhook-server-cert" not found Feb 01 06:54:23 crc kubenswrapper[4546]: E0201 06:54:23.891360 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:31.891353339 +0000 UTC m=+702.542289365 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "metrics-server-cert" not found Feb 01 06:54:25 crc kubenswrapper[4546]: I0201 06:54:25.423395 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:54:25 crc kubenswrapper[4546]: I0201 06:54:25.423477 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:54:30 crc kubenswrapper[4546]: E0201 06:54:30.259222 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382" Feb 01 06:54:30 crc kubenswrapper[4546]: E0201 06:54:30.259904 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-whr72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-gz6jj_openstack-operators(75a1fdba-e747-4bd1-8ef0-0afab1e16130): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:54:30 crc kubenswrapper[4546]: E0201 06:54:30.261514 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" podUID="75a1fdba-e747-4bd1-8ef0-0afab1e16130" Feb 01 06:54:30 crc kubenswrapper[4546]: E0201 06:54:30.491034 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" podUID="75a1fdba-e747-4bd1-8ef0-0afab1e16130" Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.050125 4546 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be" Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.050321 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6stfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-czc5g_openstack-operators(7cb67046-56b1-4ee1-9934-011f5b399566): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.051838 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" podUID="7cb67046-56b1-4ee1-9934-011f5b399566" Feb 01 06:54:31 crc kubenswrapper[4546]: I0201 06:54:31.089905 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:31 crc kubenswrapper[4546]: I0201 06:54:31.097359 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cfdbf554-1ffa-45f1-a41f-7013fa78a1a7-cert\") pod \"infra-operator-controller-manager-79955696d6-f524g\" (UID: \"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:31 crc kubenswrapper[4546]: I0201 06:54:31.351403 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:31 crc kubenswrapper[4546]: I0201 06:54:31.394116 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.394338 4546 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.394390 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert podName:1b7a6296-7403-4384-ad36-8cc2baf9bcc4 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:47.394376115 +0000 UTC m=+718.045312131 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" (UID: "1b7a6296-7403-4384-ad36-8cc2baf9bcc4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.498422 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" podUID="7cb67046-56b1-4ee1-9934-011f5b399566" Feb 01 06:54:31 crc kubenswrapper[4546]: I0201 06:54:31.905437 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:31 crc kubenswrapper[4546]: I0201 06:54:31.905525 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.905590 4546 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.905619 4546 secret.go:188] Couldn't get secret 
openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.905661 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:47.905624141 +0000 UTC m=+718.556560147 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "metrics-server-cert" not found Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.905678 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs podName:f165b121-cc85-4b6d-90a9-841971062150 nodeName:}" failed. No retries permitted until 2026-02-01 06:54:47.905670648 +0000 UTC m=+718.556606664 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-5rsln" (UID: "f165b121-cc85-4b6d-90a9-841971062150") : secret "webhook-server-cert" not found Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.975168 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17" Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.975447 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrgws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-lplnz_openstack-operators(f6218b7d-d104-411c-9e35-f13fa9d7b381): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:54:31 crc kubenswrapper[4546]: E0201 06:54:31.977058 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" podUID="f6218b7d-d104-411c-9e35-f13fa9d7b381" Feb 01 06:54:32 crc kubenswrapper[4546]: E0201 06:54:32.504429 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" podUID="f6218b7d-d104-411c-9e35-f13fa9d7b381" Feb 01 06:54:32 crc kubenswrapper[4546]: E0201 06:54:32.684057 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c" Feb 01 06:54:32 crc kubenswrapper[4546]: E0201 06:54:32.684349 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7fslc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7b6c4d8c5f-tbjxg_openstack-operators(1730fca3-3542-4276-9206-d786273fbbcf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:54:32 crc kubenswrapper[4546]: E0201 06:54:32.685967 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" podUID="1730fca3-3542-4276-9206-d786273fbbcf" Feb 01 06:54:33 crc kubenswrapper[4546]: E0201 06:54:33.115658 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241" Feb 01 06:54:33 crc kubenswrapper[4546]: E0201 06:54:33.116128 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrtnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-s5zjw_openstack-operators(54ff073b-b484-4f60-a66c-527d1c2eebe2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:54:33 crc kubenswrapper[4546]: E0201 06:54:33.117532 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" podUID="54ff073b-b484-4f60-a66c-527d1c2eebe2" Feb 01 06:54:33 crc kubenswrapper[4546]: E0201 06:54:33.509262 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" podUID="1730fca3-3542-4276-9206-d786273fbbcf" Feb 01 06:54:33 crc kubenswrapper[4546]: E0201 06:54:33.511388 4546 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" podUID="54ff073b-b484-4f60-a66c-527d1c2eebe2" Feb 01 06:54:33 crc kubenswrapper[4546]: E0201 06:54:33.636562 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e" Feb 01 06:54:33 crc kubenswrapper[4546]: E0201 06:54:33.636849 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dc5r8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-jfh9x_openstack-operators(1f28aee6-f110-4f42-a043-973026e50931): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:54:33 crc kubenswrapper[4546]: E0201 06:54:33.638656 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" podUID="1f28aee6-f110-4f42-a043-973026e50931" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.081116 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-f524g"] Feb 01 06:54:34 crc kubenswrapper[4546]: W0201 06:54:34.093819 4546 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfdbf554_1ffa_45f1_a41f_7013fa78a1a7.slice/crio-bac68539a25199f0922c1977c1d72ba006e48a78c7665c0a2f2d67adf2b22389 WatchSource:0}: Error finding container bac68539a25199f0922c1977c1d72ba006e48a78c7665c0a2f2d67adf2b22389: Status 404 returned error can't find the container with id bac68539a25199f0922c1977c1d72ba006e48a78c7665c0a2f2d67adf2b22389 Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.520527 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" event={"ID":"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7","Type":"ContainerStarted","Data":"bac68539a25199f0922c1977c1d72ba006e48a78c7665c0a2f2d67adf2b22389"} Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.524696 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" event={"ID":"bcc5200d-ffbe-4db8-9d56-3a0e880dafbb","Type":"ContainerStarted","Data":"3548ab29e76e11966f9ef1d11504213cc8c4e3493cb0329c9210795065d220da"} Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.524926 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.532842 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" event={"ID":"a4a51000-7d7d-4086-a484-bc1206e61efd","Type":"ContainerStarted","Data":"67b48870b3d5670c5b6aabb9362bc6eaf635f30682980b1f016130e88abfeff1"} Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.533336 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.536421 4546 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" event={"ID":"3b81f7fa-1d81-4157-bcda-fa30d8c904d0","Type":"ContainerStarted","Data":"8042ef4e74c21531be5b31d15f88a3c6bb1f3522b0a61669bcee89a524900e89"} Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.536479 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.538093 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" event={"ID":"f0af8317-a52f-4860-afb9-bf87fd8b5a9c","Type":"ContainerStarted","Data":"eee5931df3dec7d6d5bb60e3dc4125f4232986b9a75a2e18fbf3dd25db8cea5c"} Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.538153 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.539599 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" event={"ID":"7d68804a-07fb-4472-b820-b4a573c6fa5e","Type":"ContainerStarted","Data":"2d731723babdc5386180daaecb3191f3d70e28c2c5cd75146508d4535ba8bc24"} Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.539710 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.541601 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" event={"ID":"1320b9af-c5c3-4bbb-8cc2-b0b5d0a77200","Type":"ContainerStarted","Data":"0beec1cf6e603ad64903b54df3397917929e0436f8950fd5513f9804087408c2"} Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 
06:54:34.541641 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.543579 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" event={"ID":"7f01d442-49f8-4409-a42b-9fc3529b1913","Type":"ContainerStarted","Data":"f277741e4bfd33ffad12a84962be153b0c12140a28f5a3887a9cf5d5ae12445f"} Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.543949 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.552431 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" podStartSLOduration=2.84420299 podStartE2EDuration="19.552420671s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.934474089 +0000 UTC m=+687.585410106" lastFinishedPulling="2026-02-01 06:54:33.64269177 +0000 UTC m=+704.293627787" observedRunningTime="2026-02-01 06:54:34.550984644 +0000 UTC m=+705.201920660" watchObservedRunningTime="2026-02-01 06:54:34.552420671 +0000 UTC m=+705.203356688" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.559771 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" event={"ID":"ea2714a7-74c2-486c-963f-015bd55b7f3b","Type":"ContainerStarted","Data":"ad130039324d45b7690959d55a9ab812e22f3497b9b3fca19b76bda7a34765dc"} Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.559848 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.582227 
4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" podStartSLOduration=3.31677395 podStartE2EDuration="20.582215174s" podCreationTimestamp="2026-02-01 06:54:14 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.374281961 +0000 UTC m=+687.025217977" lastFinishedPulling="2026-02-01 06:54:33.639723186 +0000 UTC m=+704.290659201" observedRunningTime="2026-02-01 06:54:34.577155207 +0000 UTC m=+705.228091213" watchObservedRunningTime="2026-02-01 06:54:34.582215174 +0000 UTC m=+705.233151181" Feb 01 06:54:34 crc kubenswrapper[4546]: E0201 06:54:34.586321 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" podUID="1f28aee6-f110-4f42-a043-973026e50931" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.613446 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" podStartSLOduration=2.451399026 podStartE2EDuration="19.613427832s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.480503254 +0000 UTC m=+687.131439270" lastFinishedPulling="2026-02-01 06:54:33.64253206 +0000 UTC m=+704.293468076" observedRunningTime="2026-02-01 06:54:34.602167597 +0000 UTC m=+705.253103612" watchObservedRunningTime="2026-02-01 06:54:34.613427832 +0000 UTC m=+705.264363848" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.627950 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" podStartSLOduration=2.433405937 podStartE2EDuration="19.627937904s" 
podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.449389834 +0000 UTC m=+687.100325849" lastFinishedPulling="2026-02-01 06:54:33.6439218 +0000 UTC m=+704.294857816" observedRunningTime="2026-02-01 06:54:34.620097054 +0000 UTC m=+705.271033060" watchObservedRunningTime="2026-02-01 06:54:34.627937904 +0000 UTC m=+705.278873921" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.656665 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" podStartSLOduration=2.458805087 podStartE2EDuration="19.65664583s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.446101664 +0000 UTC m=+687.097037671" lastFinishedPulling="2026-02-01 06:54:33.643942398 +0000 UTC m=+704.294878414" observedRunningTime="2026-02-01 06:54:34.642743083 +0000 UTC m=+705.293679098" watchObservedRunningTime="2026-02-01 06:54:34.65664583 +0000 UTC m=+705.307581846" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.706146 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" podStartSLOduration=2.906034585 podStartE2EDuration="19.706133417s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.842430583 +0000 UTC m=+687.493366599" lastFinishedPulling="2026-02-01 06:54:33.642529415 +0000 UTC m=+704.293465431" observedRunningTime="2026-02-01 06:54:34.685954138 +0000 UTC m=+705.336890153" watchObservedRunningTime="2026-02-01 06:54:34.706133417 +0000 UTC m=+705.357069432" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.706336 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" podStartSLOduration=2.983244327 podStartE2EDuration="19.706331459s" podCreationTimestamp="2026-02-01 
06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.921197041 +0000 UTC m=+687.572133057" lastFinishedPulling="2026-02-01 06:54:33.644284173 +0000 UTC m=+704.295220189" observedRunningTime="2026-02-01 06:54:34.704951929 +0000 UTC m=+705.355887936" watchObservedRunningTime="2026-02-01 06:54:34.706331459 +0000 UTC m=+705.357267466" Feb 01 06:54:34 crc kubenswrapper[4546]: I0201 06:54:34.723159 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" podStartSLOduration=2.926135276 podStartE2EDuration="19.723150615s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.841819742 +0000 UTC m=+687.492755758" lastFinishedPulling="2026-02-01 06:54:33.638835081 +0000 UTC m=+704.289771097" observedRunningTime="2026-02-01 06:54:34.72145582 +0000 UTC m=+705.372391835" watchObservedRunningTime="2026-02-01 06:54:34.723150615 +0000 UTC m=+705.374086631" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.625959 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" event={"ID":"32163f3e-1db1-4e14-97f4-7341dfe224f6","Type":"ContainerStarted","Data":"b77e4c69eb6835e8d3db6d45dbe1f98df895add60c617fd9c86af4b5ec0bf1dc"} Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.627693 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" event={"ID":"013e345a-5f1a-41ac-8a6b-c33a35294181","Type":"ContainerStarted","Data":"403e04f7c40c89d7fde7e3c2a2c3e8f6b5b56a444040ceb6fc7c903a55b28abb"} Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.627851 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.628909 4546 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" event={"ID":"230e3dd1-d069-4350-abb2-612a85465ea1","Type":"ContainerStarted","Data":"f944fc2ff956397326b6c77d934e39d98bda9152c33d40ec7a67b91e3ee202a2"} Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.629231 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.630406 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" event={"ID":"a9c99c4d-dd86-45a1-af29-a0583061329f","Type":"ContainerStarted","Data":"57f195de326efda977215b326b9d2bdbd500b911c3557b7cf7f27f094f989b92"} Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.630731 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.631885 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" event={"ID":"80be6ca8-0db1-4bd1-948c-44947f6f737f","Type":"ContainerStarted","Data":"3cb66b7ce5f41002c8d9b698ecf2feb87e8113fb46a2f622911d734307881a3d"} Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.632181 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.633596 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" event={"ID":"ac5fa267-3d2a-46be-a790-01591d89d61a","Type":"ContainerStarted","Data":"1e2c957423661e2b0cd63c94379e4bc2a2c268e1fc2adda9c3b355e3083d4aa8"} Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.633955 
4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.634957 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" event={"ID":"cfdbf554-1ffa-45f1-a41f-7013fa78a1a7","Type":"ContainerStarted","Data":"a6df4f8fbc65ebd284449ac01ef1eb18059119bd931356f198f3493b8d6e9ec0"} Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.635329 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.670978 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zh87z" podStartSLOduration=2.889307583 podStartE2EDuration="27.670968828s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:17.097351019 +0000 UTC m=+687.748287036" lastFinishedPulling="2026-02-01 06:54:41.879012275 +0000 UTC m=+712.529948281" observedRunningTime="2026-02-01 06:54:42.650529108 +0000 UTC m=+713.301465115" watchObservedRunningTime="2026-02-01 06:54:42.670968828 +0000 UTC m=+713.321904844" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.671886 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" podStartSLOduration=2.876907859 podStartE2EDuration="27.671882331s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:17.084020731 +0000 UTC m=+687.734956747" lastFinishedPulling="2026-02-01 06:54:41.878995203 +0000 UTC m=+712.529931219" observedRunningTime="2026-02-01 06:54:42.669260418 +0000 UTC m=+713.320196434" watchObservedRunningTime="2026-02-01 06:54:42.671882331 +0000 UTC 
m=+713.322818346" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.692910 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" podStartSLOduration=2.910394743 podStartE2EDuration="27.692903687s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:17.097926454 +0000 UTC m=+687.748862470" lastFinishedPulling="2026-02-01 06:54:41.880435398 +0000 UTC m=+712.531371414" observedRunningTime="2026-02-01 06:54:42.691691632 +0000 UTC m=+713.342627638" watchObservedRunningTime="2026-02-01 06:54:42.692903687 +0000 UTC m=+713.343839703" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.727767 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" podStartSLOduration=2.889891993 podStartE2EDuration="27.727759021s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:17.043069747 +0000 UTC m=+687.694005762" lastFinishedPulling="2026-02-01 06:54:41.880936773 +0000 UTC m=+712.531872790" observedRunningTime="2026-02-01 06:54:42.72547791 +0000 UTC m=+713.376413927" watchObservedRunningTime="2026-02-01 06:54:42.727759021 +0000 UTC m=+713.378695037" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.728906 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" podStartSLOduration=2.94323505 podStartE2EDuration="27.728901986s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:17.098381641 +0000 UTC m=+687.749317658" lastFinishedPulling="2026-02-01 06:54:41.884048578 +0000 UTC m=+712.534984594" observedRunningTime="2026-02-01 06:54:42.714237432 +0000 UTC m=+713.365173448" watchObservedRunningTime="2026-02-01 06:54:42.728901986 +0000 UTC m=+713.379838002" Feb 01 06:54:42 
crc kubenswrapper[4546]: I0201 06:54:42.756625 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" podStartSLOduration=2.984798408 podStartE2EDuration="27.756617701s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:17.086217732 +0000 UTC m=+687.737153749" lastFinishedPulling="2026-02-01 06:54:41.858037026 +0000 UTC m=+712.508973042" observedRunningTime="2026-02-01 06:54:42.751446252 +0000 UTC m=+713.402382258" watchObservedRunningTime="2026-02-01 06:54:42.756617701 +0000 UTC m=+713.407553717" Feb 01 06:54:42 crc kubenswrapper[4546]: I0201 06:54:42.768062 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" podStartSLOduration=19.971892945 podStartE2EDuration="27.768056914s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:34.1013859 +0000 UTC m=+704.752321916" lastFinishedPulling="2026-02-01 06:54:41.897549859 +0000 UTC m=+712.548485885" observedRunningTime="2026-02-01 06:54:42.764218137 +0000 UTC m=+713.415154154" watchObservedRunningTime="2026-02-01 06:54:42.768056914 +0000 UTC m=+713.418992931" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.335266 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-nn6m6" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.386739 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-tfhhx" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.411776 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-lhdxk" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.428577 
4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-vbzg9" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.663831 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" event={"ID":"75a1fdba-e747-4bd1-8ef0-0afab1e16130","Type":"ContainerStarted","Data":"a31df8f8694c4c8dfbe722a89ab5b004a94c8b98a10a20a0663ebba35f132110"} Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.664228 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.664849 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-n5srj" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.700434 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-zrbx8" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.702901 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" podStartSLOduration=2.452512143 podStartE2EDuration="30.70288686s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:17.028154071 +0000 UTC m=+687.679090086" lastFinishedPulling="2026-02-01 06:54:45.278528787 +0000 UTC m=+715.929464803" observedRunningTime="2026-02-01 06:54:45.693885091 +0000 UTC m=+716.344821107" watchObservedRunningTime="2026-02-01 06:54:45.70288686 +0000 UTC m=+716.353822876" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.756567 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-585dbc889-bqpf2" Feb 01 06:54:45 crc kubenswrapper[4546]: I0201 06:54:45.832756 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-c5wwk" Feb 01 06:54:46 crc kubenswrapper[4546]: I0201 06:54:46.667987 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" event={"ID":"f6218b7d-d104-411c-9e35-f13fa9d7b381","Type":"ContainerStarted","Data":"e67d2eda4e43910b48cad69b07b34ae77b0e85dbe7370e865000463407527d4b"} Feb 01 06:54:46 crc kubenswrapper[4546]: I0201 06:54:46.669058 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" Feb 01 06:54:46 crc kubenswrapper[4546]: I0201 06:54:46.671795 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" event={"ID":"54ff073b-b484-4f60-a66c-527d1c2eebe2","Type":"ContainerStarted","Data":"dc4f506ed85ddad79bf5cb14b248e297ed4b18cc8d2de15dd8745b0096661545"} Feb 01 06:54:46 crc kubenswrapper[4546]: I0201 06:54:46.672154 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" Feb 01 06:54:46 crc kubenswrapper[4546]: I0201 06:54:46.682079 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" podStartSLOduration=2.459700332 podStartE2EDuration="31.682044896s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.931974818 +0000 UTC m=+687.582910824" lastFinishedPulling="2026-02-01 06:54:46.154319371 +0000 UTC m=+716.805255388" observedRunningTime="2026-02-01 06:54:46.679234278 +0000 UTC m=+717.330170294" 
watchObservedRunningTime="2026-02-01 06:54:46.682044896 +0000 UTC m=+717.332980912" Feb 01 06:54:46 crc kubenswrapper[4546]: I0201 06:54:46.694967 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" podStartSLOduration=2.566929736 podStartE2EDuration="31.694951696s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:17.025541615 +0000 UTC m=+687.676477632" lastFinishedPulling="2026-02-01 06:54:46.153563577 +0000 UTC m=+716.804499592" observedRunningTime="2026-02-01 06:54:46.692915528 +0000 UTC m=+717.343851534" watchObservedRunningTime="2026-02-01 06:54:46.694951696 +0000 UTC m=+717.345887712" Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.435373 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.452885 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b7a6296-7403-4384-ad36-8cc2baf9bcc4-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q\" (UID: \"1b7a6296-7403-4384-ad36-8cc2baf9bcc4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.624682 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.741058 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" event={"ID":"7cb67046-56b1-4ee1-9934-011f5b399566","Type":"ContainerStarted","Data":"19043ac81f5969685dc0c1ebae5b5472c54281feeea604770f6ace1ad4b22541"} Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.743568 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.781289 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" podStartSLOduration=2.37936609 podStartE2EDuration="32.781265783s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.837443161 +0000 UTC m=+687.488379178" lastFinishedPulling="2026-02-01 06:54:47.239342855 +0000 UTC m=+717.890278871" observedRunningTime="2026-02-01 06:54:47.779902072 +0000 UTC m=+718.430838088" watchObservedRunningTime="2026-02-01 06:54:47.781265783 +0000 UTC m=+718.432201798" Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.952107 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.952546 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.969046 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:47 crc kubenswrapper[4546]: I0201 06:54:47.972887 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f165b121-cc85-4b6d-90a9-841971062150-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-5rsln\" (UID: \"f165b121-cc85-4b6d-90a9-841971062150\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.057528 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q"] Feb 01 06:54:48 crc kubenswrapper[4546]: W0201 06:54:48.067098 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b7a6296_7403_4384_ad36_8cc2baf9bcc4.slice/crio-dcbdeb73ba23fc13f3c0f358b6750f992d76838670667b739461e78a10ce75df WatchSource:0}: Error finding container dcbdeb73ba23fc13f3c0f358b6750f992d76838670667b739461e78a10ce75df: Status 404 returned error can't find the container with id dcbdeb73ba23fc13f3c0f358b6750f992d76838670667b739461e78a10ce75df Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.145177 4546 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.355646 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln"] Feb 01 06:54:48 crc kubenswrapper[4546]: W0201 06:54:48.379755 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf165b121_cc85_4b6d_90a9_841971062150.slice/crio-47dd569372e887c329723bb19d1bffd108dc66bbf8107e5ebdae8b27d8797205 WatchSource:0}: Error finding container 47dd569372e887c329723bb19d1bffd108dc66bbf8107e5ebdae8b27d8797205: Status 404 returned error can't find the container with id 47dd569372e887c329723bb19d1bffd108dc66bbf8107e5ebdae8b27d8797205 Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.749622 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" event={"ID":"1730fca3-3542-4276-9206-d786273fbbcf","Type":"ContainerStarted","Data":"b4eef01ec164282c98bd1b2c220b45da35c23872c827d1ed7e5a96fb1c08a1bc"} Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.750107 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.751365 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" event={"ID":"f165b121-cc85-4b6d-90a9-841971062150","Type":"ContainerStarted","Data":"7c577b9a494b910abb44b251146d88206832259ba05a613a3225320762d0b5a0"} Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.751485 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" 
event={"ID":"f165b121-cc85-4b6d-90a9-841971062150","Type":"ContainerStarted","Data":"47dd569372e887c329723bb19d1bffd108dc66bbf8107e5ebdae8b27d8797205"} Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.751515 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.753757 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" event={"ID":"1b7a6296-7403-4384-ad36-8cc2baf9bcc4","Type":"ContainerStarted","Data":"dcbdeb73ba23fc13f3c0f358b6750f992d76838670667b739461e78a10ce75df"} Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.773699 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" podStartSLOduration=2.722579065 podStartE2EDuration="34.77367296s" podCreationTimestamp="2026-02-01 06:54:14 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.212016043 +0000 UTC m=+686.862952059" lastFinishedPulling="2026-02-01 06:54:48.263109937 +0000 UTC m=+718.914045954" observedRunningTime="2026-02-01 06:54:48.76772113 +0000 UTC m=+719.418657146" watchObservedRunningTime="2026-02-01 06:54:48.77367296 +0000 UTC m=+719.424608976" Feb 01 06:54:48 crc kubenswrapper[4546]: I0201 06:54:48.799346 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" podStartSLOduration=33.799329413 podStartE2EDuration="33.799329413s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:54:48.797954751 +0000 UTC m=+719.448890768" watchObservedRunningTime="2026-02-01 06:54:48.799329413 +0000 UTC m=+719.450265429" Feb 01 
06:54:49 crc kubenswrapper[4546]: I0201 06:54:49.770110 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" event={"ID":"1f28aee6-f110-4f42-a043-973026e50931","Type":"ContainerStarted","Data":"2dab5fb2ad6117cdb875673afe21cf5ed917b27f39c070b47ad740cc139d075a"} Feb 01 06:54:49 crc kubenswrapper[4546]: I0201 06:54:49.770922 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" Feb 01 06:54:50 crc kubenswrapper[4546]: I0201 06:54:50.780995 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" event={"ID":"1b7a6296-7403-4384-ad36-8cc2baf9bcc4","Type":"ContainerStarted","Data":"075520fcd67b8c759e7d7da5edfe09508af5a38fd428e06d8b8baab68965dea0"} Feb 01 06:54:50 crc kubenswrapper[4546]: I0201 06:54:50.781061 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:50 crc kubenswrapper[4546]: I0201 06:54:50.809270 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" podStartSLOduration=33.422131631 podStartE2EDuration="35.809258801s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:48.06937482 +0000 UTC m=+718.720310836" lastFinishedPulling="2026-02-01 06:54:50.456501991 +0000 UTC m=+721.107438006" observedRunningTime="2026-02-01 06:54:50.805869492 +0000 UTC m=+721.456805509" watchObservedRunningTime="2026-02-01 06:54:50.809258801 +0000 UTC m=+721.460194817" Feb 01 06:54:50 crc kubenswrapper[4546]: I0201 06:54:50.812280 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" podStartSLOduration=3.627146223 podStartE2EDuration="35.812272562s" podCreationTimestamp="2026-02-01 06:54:15 +0000 UTC" firstStartedPulling="2026-02-01 06:54:16.930477815 +0000 UTC m=+687.581413831" lastFinishedPulling="2026-02-01 06:54:49.115604154 +0000 UTC m=+719.766540170" observedRunningTime="2026-02-01 06:54:49.788959475 +0000 UTC m=+720.439895501" watchObservedRunningTime="2026-02-01 06:54:50.812272562 +0000 UTC m=+721.463208579" Feb 01 06:54:51 crc kubenswrapper[4546]: I0201 06:54:51.357076 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-f524g" Feb 01 06:54:55 crc kubenswrapper[4546]: I0201 06:54:55.326495 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-tbjxg" Feb 01 06:54:55 crc kubenswrapper[4546]: I0201 06:54:55.421198 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:54:55 crc kubenswrapper[4546]: I0201 06:54:55.421288 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:54:55 crc kubenswrapper[4546]: I0201 06:54:55.766090 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-czc5g" Feb 01 06:54:55 crc kubenswrapper[4546]: I0201 06:54:55.796773 4546 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-lplnz" Feb 01 06:54:55 crc kubenswrapper[4546]: I0201 06:54:55.818105 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-jfh9x" Feb 01 06:54:55 crc kubenswrapper[4546]: I0201 06:54:55.839379 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-kgqrr" Feb 01 06:54:55 crc kubenswrapper[4546]: I0201 06:54:55.904915 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-t5rzn" Feb 01 06:54:55 crc kubenswrapper[4546]: I0201 06:54:55.938997 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-zddkx" Feb 01 06:54:56 crc kubenswrapper[4546]: I0201 06:54:56.006112 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-gz6jj" Feb 01 06:54:56 crc kubenswrapper[4546]: I0201 06:54:56.187731 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-s5zjw" Feb 01 06:54:56 crc kubenswrapper[4546]: I0201 06:54:56.194911 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-7n4l2" Feb 01 06:54:56 crc kubenswrapper[4546]: I0201 06:54:56.224613 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-cm5ll" Feb 01 06:54:57 crc kubenswrapper[4546]: I0201 06:54:57.629983 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7zvj6q" Feb 01 06:54:58 crc kubenswrapper[4546]: I0201 06:54:58.153190 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-5rsln" Feb 01 06:55:00 crc kubenswrapper[4546]: I0201 06:55:00.195060 4546 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 01 06:55:12 crc kubenswrapper[4546]: I0201 06:55:12.921134 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68c8986777-77ls5"] Feb 01 06:55:12 crc kubenswrapper[4546]: I0201 06:55:12.922926 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:12 crc kubenswrapper[4546]: I0201 06:55:12.924718 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-54lkd" Feb 01 06:55:12 crc kubenswrapper[4546]: I0201 06:55:12.925279 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 01 06:55:12 crc kubenswrapper[4546]: I0201 06:55:12.926717 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 01 06:55:12 crc kubenswrapper[4546]: I0201 06:55:12.933371 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 01 06:55:12 crc kubenswrapper[4546]: I0201 06:55:12.934366 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68c8986777-77ls5"] Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.004229 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745dffd8b9-zjzcv"] Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.005184 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.006554 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.015004 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745dffd8b9-zjzcv"] Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.093643 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcnlk\" (UniqueName: \"kubernetes.io/projected/871ea9d2-4067-468b-b424-74e94ccb21d3-kube-api-access-gcnlk\") pod \"dnsmasq-dns-68c8986777-77ls5\" (UID: \"871ea9d2-4067-468b-b424-74e94ccb21d3\") " pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.094428 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871ea9d2-4067-468b-b424-74e94ccb21d3-config\") pod \"dnsmasq-dns-68c8986777-77ls5\" (UID: \"871ea9d2-4067-468b-b424-74e94ccb21d3\") " pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.195544 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcnlk\" (UniqueName: \"kubernetes.io/projected/871ea9d2-4067-468b-b424-74e94ccb21d3-kube-api-access-gcnlk\") pod \"dnsmasq-dns-68c8986777-77ls5\" (UID: \"871ea9d2-4067-468b-b424-74e94ccb21d3\") " pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.195727 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-dns-svc\") pod \"dnsmasq-dns-745dffd8b9-zjzcv\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" 
Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.195815 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871ea9d2-4067-468b-b424-74e94ccb21d3-config\") pod \"dnsmasq-dns-68c8986777-77ls5\" (UID: \"871ea9d2-4067-468b-b424-74e94ccb21d3\") " pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.195952 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-config\") pod \"dnsmasq-dns-745dffd8b9-zjzcv\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.196027 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtgds\" (UniqueName: \"kubernetes.io/projected/3eb5c8ca-d813-48ae-845c-2093dc75ac66-kube-api-access-xtgds\") pod \"dnsmasq-dns-745dffd8b9-zjzcv\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.196679 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871ea9d2-4067-468b-b424-74e94ccb21d3-config\") pod \"dnsmasq-dns-68c8986777-77ls5\" (UID: \"871ea9d2-4067-468b-b424-74e94ccb21d3\") " pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.221950 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcnlk\" (UniqueName: \"kubernetes.io/projected/871ea9d2-4067-468b-b424-74e94ccb21d3-kube-api-access-gcnlk\") pod \"dnsmasq-dns-68c8986777-77ls5\" (UID: \"871ea9d2-4067-468b-b424-74e94ccb21d3\") " pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 
06:55:13.241619 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.296525 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-dns-svc\") pod \"dnsmasq-dns-745dffd8b9-zjzcv\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.296594 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-config\") pod \"dnsmasq-dns-745dffd8b9-zjzcv\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.296619 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtgds\" (UniqueName: \"kubernetes.io/projected/3eb5c8ca-d813-48ae-845c-2093dc75ac66-kube-api-access-xtgds\") pod \"dnsmasq-dns-745dffd8b9-zjzcv\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.297811 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-config\") pod \"dnsmasq-dns-745dffd8b9-zjzcv\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.297950 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-dns-svc\") pod \"dnsmasq-dns-745dffd8b9-zjzcv\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " 
pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.311585 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtgds\" (UniqueName: \"kubernetes.io/projected/3eb5c8ca-d813-48ae-845c-2093dc75ac66-kube-api-access-xtgds\") pod \"dnsmasq-dns-745dffd8b9-zjzcv\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.325227 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.455973 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68c8986777-77ls5"] Feb 01 06:55:13 crc kubenswrapper[4546]: W0201 06:55:13.477024 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod871ea9d2_4067_468b_b424_74e94ccb21d3.slice/crio-e23545d2f273b3303a0fcb2b65c5d8df2c6f3122db049e3fe8c0b885b9d7a4fe WatchSource:0}: Error finding container e23545d2f273b3303a0fcb2b65c5d8df2c6f3122db049e3fe8c0b885b9d7a4fe: Status 404 returned error can't find the container with id e23545d2f273b3303a0fcb2b65c5d8df2c6f3122db049e3fe8c0b885b9d7a4fe Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.734746 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745dffd8b9-zjzcv"] Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.925217 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" event={"ID":"3eb5c8ca-d813-48ae-845c-2093dc75ac66","Type":"ContainerStarted","Data":"bb51f4edc6641c529d7ca144a9f798a89b51604c99fff15c8ff33fea935e9588"} Feb 01 06:55:13 crc kubenswrapper[4546]: I0201 06:55:13.926124 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c8986777-77ls5" 
event={"ID":"871ea9d2-4067-468b-b424-74e94ccb21d3","Type":"ContainerStarted","Data":"e23545d2f273b3303a0fcb2b65c5d8df2c6f3122db049e3fe8c0b885b9d7a4fe"} Feb 01 06:55:15 crc kubenswrapper[4546]: I0201 06:55:15.929930 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68c8986777-77ls5"] Feb 01 06:55:15 crc kubenswrapper[4546]: I0201 06:55:15.981726 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-668487f585-lq2g4"] Feb 01 06:55:15 crc kubenswrapper[4546]: I0201 06:55:15.983615 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.054465 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668487f585-lq2g4"] Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.154163 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96q52\" (UniqueName: \"kubernetes.io/projected/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-kube-api-access-96q52\") pod \"dnsmasq-dns-668487f585-lq2g4\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.154217 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-config\") pod \"dnsmasq-dns-668487f585-lq2g4\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.154317 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-dns-svc\") pod \"dnsmasq-dns-668487f585-lq2g4\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " 
pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.256929 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-dns-svc\") pod \"dnsmasq-dns-668487f585-lq2g4\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.256996 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96q52\" (UniqueName: \"kubernetes.io/projected/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-kube-api-access-96q52\") pod \"dnsmasq-dns-668487f585-lq2g4\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.257023 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-config\") pod \"dnsmasq-dns-668487f585-lq2g4\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.257820 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-config\") pod \"dnsmasq-dns-668487f585-lq2g4\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.258310 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-dns-svc\") pod \"dnsmasq-dns-668487f585-lq2g4\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.304545 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96q52\" (UniqueName: \"kubernetes.io/projected/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-kube-api-access-96q52\") pod \"dnsmasq-dns-668487f585-lq2g4\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.356173 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.402629 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745dffd8b9-zjzcv"] Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.428773 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb7fd957f-mlbgp"] Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.429725 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.449013 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb7fd957f-mlbgp"] Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.563566 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-dns-svc\") pod \"dnsmasq-dns-6bb7fd957f-mlbgp\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.563619 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6wkp\" (UniqueName: \"kubernetes.io/projected/ffe0876a-46be-41a8-8ca5-d99e28e349a6-kube-api-access-k6wkp\") pod \"dnsmasq-dns-6bb7fd957f-mlbgp\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" 
Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.563648 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-config\") pod \"dnsmasq-dns-6bb7fd957f-mlbgp\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.665498 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-dns-svc\") pod \"dnsmasq-dns-6bb7fd957f-mlbgp\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.665585 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6wkp\" (UniqueName: \"kubernetes.io/projected/ffe0876a-46be-41a8-8ca5-d99e28e349a6-kube-api-access-k6wkp\") pod \"dnsmasq-dns-6bb7fd957f-mlbgp\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.665617 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-config\") pod \"dnsmasq-dns-6bb7fd957f-mlbgp\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.666960 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-dns-svc\") pod \"dnsmasq-dns-6bb7fd957f-mlbgp\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.668438 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-config\") pod \"dnsmasq-dns-6bb7fd957f-mlbgp\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.683448 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6wkp\" (UniqueName: \"kubernetes.io/projected/ffe0876a-46be-41a8-8ca5-d99e28e349a6-kube-api-access-k6wkp\") pod \"dnsmasq-dns-6bb7fd957f-mlbgp\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.783102 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.927768 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668487f585-lq2g4"] Feb 01 06:55:16 crc kubenswrapper[4546]: W0201 06:55:16.954022 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddec7c8ef_e98a_4313_b7b6_669c3b9217b9.slice/crio-6d482e3f9bdce8c15480b46f9cef0b7e684a221519ae440422d420940466f0b0 WatchSource:0}: Error finding container 6d482e3f9bdce8c15480b46f9cef0b7e684a221519ae440422d420940466f0b0: Status 404 returned error can't find the container with id 6d482e3f9bdce8c15480b46f9cef0b7e684a221519ae440422d420940466f0b0 Feb 01 06:55:16 crc kubenswrapper[4546]: I0201 06:55:16.985267 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668487f585-lq2g4" event={"ID":"dec7c8ef-e98a-4313-b7b6-669c3b9217b9","Type":"ContainerStarted","Data":"6d482e3f9bdce8c15480b46f9cef0b7e684a221519ae440422d420940466f0b0"} Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.170183 4546 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.174995 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.175181 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.180206 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.180217 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.180585 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.182127 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.182479 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5njj2" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.182655 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.183310 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.243985 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb7fd957f-mlbgp"] Feb 01 06:55:17 crc kubenswrapper[4546]: W0201 06:55:17.260493 4546 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffe0876a_46be_41a8_8ca5_d99e28e349a6.slice/crio-f897537a8c8297c1eab5e212adc08d0161f7c2121a46f178290b12d8b572e427 WatchSource:0}: Error finding container f897537a8c8297c1eab5e212adc08d0161f7c2121a46f178290b12d8b572e427: Status 404 returned error can't find the container with id f897537a8c8297c1eab5e212adc08d0161f7c2121a46f178290b12d8b572e427 Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.379884 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.379955 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.379996 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.380029 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc 
kubenswrapper[4546]: I0201 06:55:17.380055 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9259854-6c00-413e-9061-399c808d9360-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.380084 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkfdp\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-kube-api-access-pkfdp\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.380131 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-config-data\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.380398 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.380480 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9259854-6c00-413e-9061-399c808d9360-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.380568 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.380660 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.482509 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkfdp\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-kube-api-access-pkfdp\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.482583 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-config-data\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.482602 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.482617 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/f9259854-6c00-413e-9061-399c808d9360-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.482639 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.482834 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.483131 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.483198 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.483214 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 
01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.483243 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.483259 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9259854-6c00-413e-9061-399c808d9360-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.484134 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.485250 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-config-data\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.485321 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.485981 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.486692 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.487039 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.492302 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.492776 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.493421 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9259854-6c00-413e-9061-399c808d9360-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 
01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.494986 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9259854-6c00-413e-9061-399c808d9360-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.501076 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkfdp\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-kube-api-access-pkfdp\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.510077 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.551409 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.552574 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.558266 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.558406 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.558286 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.558609 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.558693 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.558964 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.559039 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6xlwf" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.599004 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.686518 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.686850 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7f4fv\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-kube-api-access-7f4fv\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.686900 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.686924 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.686970 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.687037 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.687066 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.687087 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.687106 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.687123 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.687155 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.788806 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7f4fv\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-kube-api-access-7f4fv\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.788870 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.788895 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.788945 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.788998 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.789029 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.789049 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.789080 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.789099 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.789135 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.789178 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.789910 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.790114 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.791082 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.791797 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.792091 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc 
kubenswrapper[4546]: I0201 06:55:17.794473 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.794582 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.800339 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.802287 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.804725 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f4fv\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-kube-api-access-7f4fv\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.808712 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.809261 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.812334 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:17 crc kubenswrapper[4546]: I0201 06:55:17.899933 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.007444 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" event={"ID":"ffe0876a-46be-41a8-8ca5-d99e28e349a6","Type":"ContainerStarted","Data":"f897537a8c8297c1eab5e212adc08d0161f7c2121a46f178290b12d8b572e427"} Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.300419 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 06:55:18 crc kubenswrapper[4546]: W0201 06:55:18.316929 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9259854_6c00_413e_9061_399c808d9360.slice/crio-7b05fa1d0884ba595ff157260bc518ba05036ede66618511bdd4c27aaae77078 WatchSource:0}: Error finding container 7b05fa1d0884ba595ff157260bc518ba05036ede66618511bdd4c27aaae77078: Status 404 returned error can't find the container with id 7b05fa1d0884ba595ff157260bc518ba05036ede66618511bdd4c27aaae77078 Feb 01 06:55:18 crc 
kubenswrapper[4546]: I0201 06:55:18.355451 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 06:55:18 crc kubenswrapper[4546]: W0201 06:55:18.371359 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a322342_7fc8_41ca_9ee3_4e1bbdbf5973.slice/crio-bd652127bb336c49fc87e99870b0a02ea8a4daf1718f26a3068b719eb0804b62 WatchSource:0}: Error finding container bd652127bb336c49fc87e99870b0a02ea8a4daf1718f26a3068b719eb0804b62: Status 404 returned error can't find the container with id bd652127bb336c49fc87e99870b0a02ea8a4daf1718f26a3068b719eb0804b62 Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.671679 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.673065 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.676287 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wczj6" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.676966 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.677445 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.677605 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.697023 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.697378 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-galera-0"] Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.805992 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.806049 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.806148 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.806188 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-kolla-config\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.806225 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvzmm\" (UniqueName: \"kubernetes.io/projected/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-kube-api-access-pvzmm\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc 
kubenswrapper[4546]: I0201 06:55:18.806259 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.806282 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.806300 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-config-data-default\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.909235 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.909536 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.909564 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-config-data-default\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.909600 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.909646 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.909743 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.909790 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-kolla-config\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.909826 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvzmm\" (UniqueName: 
\"kubernetes.io/projected/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-kube-api-access-pvzmm\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.912644 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-config-data-default\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.912643 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.912926 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.913590 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.913998 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.920410 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.923797 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvzmm\" (UniqueName: \"kubernetes.io/projected/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-kube-api-access-pvzmm\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.934879 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0675e35-afd6-4c93-a1fa-6a3ec2dd1190-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:18 crc kubenswrapper[4546]: I0201 06:55:18.988924 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190\") " pod="openstack/openstack-galera-0" Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.012951 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.025314 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f9259854-6c00-413e-9061-399c808d9360","Type":"ContainerStarted","Data":"7b05fa1d0884ba595ff157260bc518ba05036ede66618511bdd4c27aaae77078"} Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.040075 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973","Type":"ContainerStarted","Data":"bd652127bb336c49fc87e99870b0a02ea8a4daf1718f26a3068b719eb0804b62"} Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.517575 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 01 06:55:19 crc kubenswrapper[4546]: W0201 06:55:19.526562 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0675e35_afd6_4c93_a1fa_6a3ec2dd1190.slice/crio-1a2d076ad25750a10c04582087a6626c227d38adb8b88bd92259080734265104 WatchSource:0}: Error finding container 1a2d076ad25750a10c04582087a6626c227d38adb8b88bd92259080734265104: Status 404 returned error can't find the container with id 1a2d076ad25750a10c04582087a6626c227d38adb8b88bd92259080734265104 Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.966778 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.970824 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.976149 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.977383 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.985556 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-h75lc" Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.985687 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 01 06:55:19 crc kubenswrapper[4546]: I0201 06:55:19.991098 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.141773 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067bc019-6975-420a-bc5c-0cee9f5ad72f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.141840 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067bc019-6975-420a-bc5c-0cee9f5ad72f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.141918 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/067bc019-6975-420a-bc5c-0cee9f5ad72f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.141948 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067bc019-6975-420a-bc5c-0cee9f5ad72f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.142031 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lrhz\" (UniqueName: \"kubernetes.io/projected/067bc019-6975-420a-bc5c-0cee9f5ad72f-kube-api-access-6lrhz\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.142092 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.142122 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067bc019-6975-420a-bc5c-0cee9f5ad72f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.142173 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/067bc019-6975-420a-bc5c-0cee9f5ad72f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.182461 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190","Type":"ContainerStarted","Data":"1a2d076ad25750a10c04582087a6626c227d38adb8b88bd92259080734265104"} Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.245904 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lrhz\" (UniqueName: \"kubernetes.io/projected/067bc019-6975-420a-bc5c-0cee9f5ad72f-kube-api-access-6lrhz\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.246238 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.246261 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067bc019-6975-420a-bc5c-0cee9f5ad72f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.246298 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067bc019-6975-420a-bc5c-0cee9f5ad72f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.246343 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067bc019-6975-420a-bc5c-0cee9f5ad72f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.246368 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067bc019-6975-420a-bc5c-0cee9f5ad72f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.246394 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/067bc019-6975-420a-bc5c-0cee9f5ad72f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.246419 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067bc019-6975-420a-bc5c-0cee9f5ad72f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.246578 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") device mount path \"/mnt/openstack/pv12\"" 
pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.248201 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067bc019-6975-420a-bc5c-0cee9f5ad72f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.249401 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067bc019-6975-420a-bc5c-0cee9f5ad72f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.249735 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067bc019-6975-420a-bc5c-0cee9f5ad72f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.250136 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067bc019-6975-420a-bc5c-0cee9f5ad72f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.254998 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/067bc019-6975-420a-bc5c-0cee9f5ad72f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.258185 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067bc019-6975-420a-bc5c-0cee9f5ad72f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.280583 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lrhz\" (UniqueName: \"kubernetes.io/projected/067bc019-6975-420a-bc5c-0cee9f5ad72f-kube-api-access-6lrhz\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.290263 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"067bc019-6975-420a-bc5c-0cee9f5ad72f\") " pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.297690 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.354567 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.359468 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.362565 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.362584 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8p8kb" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.362808 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.375870 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.449318 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c414e341-7093-4755-a22e-1a47be4b1e4c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.449345 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6bq\" (UniqueName: \"kubernetes.io/projected/c414e341-7093-4755-a22e-1a47be4b1e4c-kube-api-access-8x6bq\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.449373 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c414e341-7093-4755-a22e-1a47be4b1e4c-config-data\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.449413 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c414e341-7093-4755-a22e-1a47be4b1e4c-kolla-config\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.449468 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c414e341-7093-4755-a22e-1a47be4b1e4c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.553552 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c414e341-7093-4755-a22e-1a47be4b1e4c-config-data\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.553848 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c414e341-7093-4755-a22e-1a47be4b1e4c-kolla-config\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.553922 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c414e341-7093-4755-a22e-1a47be4b1e4c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.553956 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c414e341-7093-4755-a22e-1a47be4b1e4c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " 
pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.553971 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6bq\" (UniqueName: \"kubernetes.io/projected/c414e341-7093-4755-a22e-1a47be4b1e4c-kube-api-access-8x6bq\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.554743 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c414e341-7093-4755-a22e-1a47be4b1e4c-kolla-config\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.555631 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c414e341-7093-4755-a22e-1a47be4b1e4c-config-data\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.561053 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c414e341-7093-4755-a22e-1a47be4b1e4c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.569482 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6bq\" (UniqueName: \"kubernetes.io/projected/c414e341-7093-4755-a22e-1a47be4b1e4c-kube-api-access-8x6bq\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.601026 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c414e341-7093-4755-a22e-1a47be4b1e4c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c414e341-7093-4755-a22e-1a47be4b1e4c\") " pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.741770 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 01 06:55:20 crc kubenswrapper[4546]: W0201 06:55:20.867244 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod067bc019_6975_420a_bc5c_0cee9f5ad72f.slice/crio-3e42b392af79cb609a2971af8bfe3f858b5da714c29fca749d7ff095dd0d4c15 WatchSource:0}: Error finding container 3e42b392af79cb609a2971af8bfe3f858b5da714c29fca749d7ff095dd0d4c15: Status 404 returned error can't find the container with id 3e42b392af79cb609a2971af8bfe3f858b5da714c29fca749d7ff095dd0d4c15 Feb 01 06:55:20 crc kubenswrapper[4546]: I0201 06:55:20.869633 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 01 06:55:21 crc kubenswrapper[4546]: I0201 06:55:21.191616 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"067bc019-6975-420a-bc5c-0cee9f5ad72f","Type":"ContainerStarted","Data":"3e42b392af79cb609a2971af8bfe3f858b5da714c29fca749d7ff095dd0d4c15"} Feb 01 06:55:21 crc kubenswrapper[4546]: I0201 06:55:21.337636 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 01 06:55:21 crc kubenswrapper[4546]: W0201 06:55:21.388121 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc414e341_7093_4755_a22e_1a47be4b1e4c.slice/crio-8bf72787ce0d03eb6b2667b92721fce4a13359fd12d9f03681038b06fa3398c4 WatchSource:0}: Error finding container 8bf72787ce0d03eb6b2667b92721fce4a13359fd12d9f03681038b06fa3398c4: Status 404 returned error can't find the container with id 
8bf72787ce0d03eb6b2667b92721fce4a13359fd12d9f03681038b06fa3398c4 Feb 01 06:55:22 crc kubenswrapper[4546]: I0201 06:55:22.247120 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c414e341-7093-4755-a22e-1a47be4b1e4c","Type":"ContainerStarted","Data":"8bf72787ce0d03eb6b2667b92721fce4a13359fd12d9f03681038b06fa3398c4"} Feb 01 06:55:22 crc kubenswrapper[4546]: I0201 06:55:22.557613 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 06:55:22 crc kubenswrapper[4546]: I0201 06:55:22.558483 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 06:55:22 crc kubenswrapper[4546]: I0201 06:55:22.563245 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2npng" Feb 01 06:55:22 crc kubenswrapper[4546]: I0201 06:55:22.567060 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 06:55:22 crc kubenswrapper[4546]: I0201 06:55:22.708451 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh46z\" (UniqueName: \"kubernetes.io/projected/fa29cd22-5996-4415-92c9-8012caf2dcfb-kube-api-access-lh46z\") pod \"kube-state-metrics-0\" (UID: \"fa29cd22-5996-4415-92c9-8012caf2dcfb\") " pod="openstack/kube-state-metrics-0" Feb 01 06:55:22 crc kubenswrapper[4546]: I0201 06:55:22.811296 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh46z\" (UniqueName: \"kubernetes.io/projected/fa29cd22-5996-4415-92c9-8012caf2dcfb-kube-api-access-lh46z\") pod \"kube-state-metrics-0\" (UID: \"fa29cd22-5996-4415-92c9-8012caf2dcfb\") " pod="openstack/kube-state-metrics-0" Feb 01 06:55:22 crc kubenswrapper[4546]: I0201 06:55:22.843836 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh46z\" (UniqueName: 
\"kubernetes.io/projected/fa29cd22-5996-4415-92c9-8012caf2dcfb-kube-api-access-lh46z\") pod \"kube-state-metrics-0\" (UID: \"fa29cd22-5996-4415-92c9-8012caf2dcfb\") " pod="openstack/kube-state-metrics-0" Feb 01 06:55:22 crc kubenswrapper[4546]: I0201 06:55:22.890584 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 06:55:23 crc kubenswrapper[4546]: I0201 06:55:23.153580 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 06:55:23 crc kubenswrapper[4546]: I0201 06:55:23.174785 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 06:55:23 crc kubenswrapper[4546]: I0201 06:55:23.312929 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fa29cd22-5996-4415-92c9-8012caf2dcfb","Type":"ContainerStarted","Data":"848310a9358a98807ffec056ecfe6bd125059678ddaf03ea23ddebcd2e8470c7"} Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.421533 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.421629 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.421672 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:55:25 crc 
kubenswrapper[4546]: I0201 06:55:25.422661 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35c1ceef8d4590b6c0af653c1017461916a166a5c1d2dcb5faa5ca14e92cf91e"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.422722 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://35c1ceef8d4590b6c0af653c1017461916a166a5c1d2dcb5faa5ca14e92cf91e" gracePeriod=600 Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.427508 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fw66k"] Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.428390 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.431741 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.432244 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.432437 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jvk2j" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.434371 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fw66k"] Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.471910 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4x5w6"] Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.474762 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.492793 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4x5w6"] Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575171 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqfgz\" (UniqueName: \"kubernetes.io/projected/fc24faa8-1959-4a00-9859-80c7fb19bea3-kube-api-access-hqfgz\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575235 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc24faa8-1959-4a00-9859-80c7fb19bea3-scripts\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575281 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0f1cbe-76e7-455a-80da-05602295973b-var-log-ovn\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575299 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0f1cbe-76e7-455a-80da-05602295973b-combined-ca-bundle\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575320 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/7f0f1cbe-76e7-455a-80da-05602295973b-var-run-ovn\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575379 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f0f1cbe-76e7-455a-80da-05602295973b-var-run\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575425 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-var-lib\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575449 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-etc-ovs\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575468 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-var-run\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575522 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7f0f1cbe-76e7-455a-80da-05602295973b-ovn-controller-tls-certs\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575548 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-var-log\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575585 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dclnv\" (UniqueName: \"kubernetes.io/projected/7f0f1cbe-76e7-455a-80da-05602295973b-kube-api-access-dclnv\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.575621 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f0f1cbe-76e7-455a-80da-05602295973b-scripts\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.676848 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc24faa8-1959-4a00-9859-80c7fb19bea3-scripts\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.676919 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0f1cbe-76e7-455a-80da-05602295973b-var-log-ovn\") pod 
\"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.676960 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0f1cbe-76e7-455a-80da-05602295973b-combined-ca-bundle\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.676983 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0f1cbe-76e7-455a-80da-05602295973b-var-run-ovn\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.677007 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f0f1cbe-76e7-455a-80da-05602295973b-var-run\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.677043 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-var-lib\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.677065 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-etc-ovs\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 
06:55:25.677085 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-var-run\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.677119 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0f1cbe-76e7-455a-80da-05602295973b-ovn-controller-tls-certs\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.677148 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-var-log\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.677165 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dclnv\" (UniqueName: \"kubernetes.io/projected/7f0f1cbe-76e7-455a-80da-05602295973b-kube-api-access-dclnv\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.677191 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f0f1cbe-76e7-455a-80da-05602295973b-scripts\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.677220 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqfgz\" 
(UniqueName: \"kubernetes.io/projected/fc24faa8-1959-4a00-9859-80c7fb19bea3-kube-api-access-hqfgz\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.680809 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-var-lib\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.680934 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-etc-ovs\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.681167 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-var-run\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.682600 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc24faa8-1959-4a00-9859-80c7fb19bea3-scripts\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.682729 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0f1cbe-76e7-455a-80da-05602295973b-var-log-ovn\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " 
pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.688216 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7f0f1cbe-76e7-455a-80da-05602295973b-var-run-ovn\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.688417 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f0f1cbe-76e7-455a-80da-05602295973b-var-run\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.688557 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fc24faa8-1959-4a00-9859-80c7fb19bea3-var-log\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.691730 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0f1cbe-76e7-455a-80da-05602295973b-combined-ca-bundle\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.693798 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqfgz\" (UniqueName: \"kubernetes.io/projected/fc24faa8-1959-4a00-9859-80c7fb19bea3-kube-api-access-hqfgz\") pod \"ovn-controller-ovs-4x5w6\" (UID: \"fc24faa8-1959-4a00-9859-80c7fb19bea3\") " pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.694900 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0f1cbe-76e7-455a-80da-05602295973b-ovn-controller-tls-certs\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.698521 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f0f1cbe-76e7-455a-80da-05602295973b-scripts\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.701899 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dclnv\" (UniqueName: \"kubernetes.io/projected/7f0f1cbe-76e7-455a-80da-05602295973b-kube-api-access-dclnv\") pod \"ovn-controller-fw66k\" (UID: \"7f0f1cbe-76e7-455a-80da-05602295973b\") " pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.787549 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fw66k" Feb 01 06:55:25 crc kubenswrapper[4546]: I0201 06:55:25.823996 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.321790 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.336334 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.345162 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.345373 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.345553 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.345697 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.345837 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pgtpt" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.356250 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.382401 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="35c1ceef8d4590b6c0af653c1017461916a166a5c1d2dcb5faa5ca14e92cf91e" exitCode=0 Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.382446 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"35c1ceef8d4590b6c0af653c1017461916a166a5c1d2dcb5faa5ca14e92cf91e"} Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.382482 4546 scope.go:117] "RemoveContainer" containerID="75a51418488f257f1413aee0bcf03cd98552efa50d1a91c2d8fa14ab0a5d1e3c" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.388851 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2025b014-b533-402c-af2f-179c921eb503-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.389047 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbx46\" (UniqueName: \"kubernetes.io/projected/2025b014-b533-402c-af2f-179c921eb503-kube-api-access-wbx46\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.389252 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2025b014-b533-402c-af2f-179c921eb503-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.389309 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.389422 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2025b014-b533-402c-af2f-179c921eb503-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.389537 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2025b014-b533-402c-af2f-179c921eb503-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.389599 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2025b014-b533-402c-af2f-179c921eb503-config\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.389627 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2025b014-b533-402c-af2f-179c921eb503-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.490922 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2025b014-b533-402c-af2f-179c921eb503-config\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.490974 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2025b014-b533-402c-af2f-179c921eb503-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.491032 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbx46\" (UniqueName: \"kubernetes.io/projected/2025b014-b533-402c-af2f-179c921eb503-kube-api-access-wbx46\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " 
pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.491051 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2025b014-b533-402c-af2f-179c921eb503-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.491082 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2025b014-b533-402c-af2f-179c921eb503-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.491102 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.491150 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2025b014-b533-402c-af2f-179c921eb503-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.491189 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2025b014-b533-402c-af2f-179c921eb503-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.491851 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2025b014-b533-402c-af2f-179c921eb503-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.492080 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.492679 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2025b014-b533-402c-af2f-179c921eb503-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.492941 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2025b014-b533-402c-af2f-179c921eb503-config\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.506685 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2025b014-b533-402c-af2f-179c921eb503-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.506874 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2025b014-b533-402c-af2f-179c921eb503-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " 
pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.507008 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbx46\" (UniqueName: \"kubernetes.io/projected/2025b014-b533-402c-af2f-179c921eb503-kube-api-access-wbx46\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.508124 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2025b014-b533-402c-af2f-179c921eb503-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.565172 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2025b014-b533-402c-af2f-179c921eb503\") " pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:26 crc kubenswrapper[4546]: I0201 06:55:26.668508 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:29 crc kubenswrapper[4546]: I0201 06:55:29.982318 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 06:55:29 crc kubenswrapper[4546]: I0201 06:55:29.986552 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:29 crc kubenswrapper[4546]: I0201 06:55:29.998312 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 01 06:55:29 crc kubenswrapper[4546]: I0201 06:55:29.999526 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:29.999942 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-djnvb" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.001242 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.035456 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.077560 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.077644 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.077695 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dncn2\" (UniqueName: \"kubernetes.io/projected/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-kube-api-access-dncn2\") pod \"ovsdbserver-sb-0\" (UID: 
\"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.077718 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.077741 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-config\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.077785 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.077810 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.077840 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 
01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.179577 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.179671 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.179709 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dncn2\" (UniqueName: \"kubernetes.io/projected/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-kube-api-access-dncn2\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.179727 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.179768 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-config\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.179803 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.179823 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.179847 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.181229 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-config\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.181736 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.181996 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Feb 
01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.183833 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.185602 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.185704 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.187490 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.198800 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dncn2\" (UniqueName: \"kubernetes.io/projected/8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e-kube-api-access-dncn2\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.224310 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e\") " pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:30 crc kubenswrapper[4546]: I0201 06:55:30.323414 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:42 crc kubenswrapper[4546]: I0201 06:55:42.800695 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4x5w6"] Feb 01 06:55:42 crc kubenswrapper[4546]: I0201 06:55:42.932993 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 06:55:42 crc kubenswrapper[4546]: I0201 06:55:42.954727 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fw66k"] Feb 01 06:55:43 crc kubenswrapper[4546]: I0201 06:55:43.015319 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 06:55:43 crc kubenswrapper[4546]: E0201 06:55:43.316147 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:55:43 crc kubenswrapper[4546]: E0201 06:55:43.316203 4546 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:55:43 crc kubenswrapper[4546]: E0201 06:55:43.316321 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xtgds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-745dffd8b9-zjzcv_openstack(3eb5c8ca-d813-48ae-845c-2093dc75ac66): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Feb 01 06:55:43 crc kubenswrapper[4546]: E0201 06:55:43.317652 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" podUID="3eb5c8ca-d813-48ae-845c-2093dc75ac66" Feb 01 06:55:43 crc kubenswrapper[4546]: W0201 06:55:43.326616 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2025b014_b533_402c_af2f_179c921eb503.slice/crio-bd39ca59fc67254e17b9a62386b370f3e05c1abb4dbff63355b570259184b66a WatchSource:0}: Error finding container bd39ca59fc67254e17b9a62386b370f3e05c1abb4dbff63355b570259184b66a: Status 404 returned error can't find the container with id bd39ca59fc67254e17b9a62386b370f3e05c1abb4dbff63355b570259184b66a Feb 01 06:55:43 crc kubenswrapper[4546]: W0201 06:55:43.331879 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc24faa8_1959_4a00_9859_80c7fb19bea3.slice/crio-be073c79e1d2f18a43cc4c35bfcb1c62de4e51bcc67b35d0a0bcca9c6a1fe813 WatchSource:0}: Error finding container be073c79e1d2f18a43cc4c35bfcb1c62de4e51bcc67b35d0a0bcca9c6a1fe813: Status 404 returned error can't find the container with id be073c79e1d2f18a43cc4c35bfcb1c62de4e51bcc67b35d0a0bcca9c6a1fe813 Feb 01 06:55:43 crc kubenswrapper[4546]: E0201 06:55:43.338788 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:55:43 crc kubenswrapper[4546]: E0201 06:55:43.338829 4546 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:55:43 crc kubenswrapper[4546]: E0201 06:55:43.338937 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gcnlk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Std
in:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-68c8986777-77ls5_openstack(871ea9d2-4067-468b-b424-74e94ccb21d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:55:43 crc kubenswrapper[4546]: E0201 06:55:43.340174 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-68c8986777-77ls5" podUID="871ea9d2-4067-468b-b424-74e94ccb21d3" Feb 01 06:55:43 crc kubenswrapper[4546]: I0201 06:55:43.571423 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2025b014-b533-402c-af2f-179c921eb503","Type":"ContainerStarted","Data":"bd39ca59fc67254e17b9a62386b370f3e05c1abb4dbff63355b570259184b66a"} Feb 01 06:55:43 crc kubenswrapper[4546]: I0201 06:55:43.572291 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4x5w6" event={"ID":"fc24faa8-1959-4a00-9859-80c7fb19bea3","Type":"ContainerStarted","Data":"be073c79e1d2f18a43cc4c35bfcb1c62de4e51bcc67b35d0a0bcca9c6a1fe813"} Feb 01 06:55:43 crc kubenswrapper[4546]: I0201 06:55:43.573573 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e","Type":"ContainerStarted","Data":"54f2275f7cca202fad3587c7c766aef32f93ede56ac11b5fa4ee580e01e485c4"} Feb 01 06:55:43 crc kubenswrapper[4546]: I0201 06:55:43.575887 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"bf1cd428222258ce8831b2b35aceea3cc1215cfdc89e91fc366faeefbc43f53d"} Feb 01 06:55:43 crc kubenswrapper[4546]: I0201 06:55:43.577757 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fw66k" event={"ID":"7f0f1cbe-76e7-455a-80da-05602295973b","Type":"ContainerStarted","Data":"3d6047104378030dc81996b7de64fc7fa7062b7bc238b1dcee32b6a3eae6432b"} Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.428810 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.431324 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.533614 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtgds\" (UniqueName: \"kubernetes.io/projected/3eb5c8ca-d813-48ae-845c-2093dc75ac66-kube-api-access-xtgds\") pod \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.533691 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-config\") pod \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.533728 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-dns-svc\") pod \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\" (UID: \"3eb5c8ca-d813-48ae-845c-2093dc75ac66\") " Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.533786 4546 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gcnlk\" (UniqueName: \"kubernetes.io/projected/871ea9d2-4067-468b-b424-74e94ccb21d3-kube-api-access-gcnlk\") pod \"871ea9d2-4067-468b-b424-74e94ccb21d3\" (UID: \"871ea9d2-4067-468b-b424-74e94ccb21d3\") " Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.533832 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871ea9d2-4067-468b-b424-74e94ccb21d3-config\") pod \"871ea9d2-4067-468b-b424-74e94ccb21d3\" (UID: \"871ea9d2-4067-468b-b424-74e94ccb21d3\") " Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.534315 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3eb5c8ca-d813-48ae-845c-2093dc75ac66" (UID: "3eb5c8ca-d813-48ae-845c-2093dc75ac66"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.534373 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-config" (OuterVolumeSpecName: "config") pod "3eb5c8ca-d813-48ae-845c-2093dc75ac66" (UID: "3eb5c8ca-d813-48ae-845c-2093dc75ac66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.534431 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871ea9d2-4067-468b-b424-74e94ccb21d3-config" (OuterVolumeSpecName: "config") pod "871ea9d2-4067-468b-b424-74e94ccb21d3" (UID: "871ea9d2-4067-468b-b424-74e94ccb21d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.538193 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb5c8ca-d813-48ae-845c-2093dc75ac66-kube-api-access-xtgds" (OuterVolumeSpecName: "kube-api-access-xtgds") pod "3eb5c8ca-d813-48ae-845c-2093dc75ac66" (UID: "3eb5c8ca-d813-48ae-845c-2093dc75ac66"). InnerVolumeSpecName "kube-api-access-xtgds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.541031 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/871ea9d2-4067-468b-b424-74e94ccb21d3-kube-api-access-gcnlk" (OuterVolumeSpecName: "kube-api-access-gcnlk") pod "871ea9d2-4067-468b-b424-74e94ccb21d3" (UID: "871ea9d2-4067-468b-b424-74e94ccb21d3"). InnerVolumeSpecName "kube-api-access-gcnlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.586789 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" event={"ID":"3eb5c8ca-d813-48ae-845c-2093dc75ac66","Type":"ContainerDied","Data":"bb51f4edc6641c529d7ca144a9f798a89b51604c99fff15c8ff33fea935e9588"} Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.586841 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745dffd8b9-zjzcv" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.592074 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68c8986777-77ls5" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.592956 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c8986777-77ls5" event={"ID":"871ea9d2-4067-468b-b424-74e94ccb21d3","Type":"ContainerDied","Data":"e23545d2f273b3303a0fcb2b65c5d8df2c6f3122db049e3fe8c0b885b9d7a4fe"} Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.637694 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtgds\" (UniqueName: \"kubernetes.io/projected/3eb5c8ca-d813-48ae-845c-2093dc75ac66-kube-api-access-xtgds\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.637735 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.638093 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3eb5c8ca-d813-48ae-845c-2093dc75ac66-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.638111 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcnlk\" (UniqueName: \"kubernetes.io/projected/871ea9d2-4067-468b-b424-74e94ccb21d3-kube-api-access-gcnlk\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.638125 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871ea9d2-4067-468b-b424-74e94ccb21d3-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.676252 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745dffd8b9-zjzcv"] Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.680160 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-745dffd8b9-zjzcv"] Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.697365 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68c8986777-77ls5"] Feb 01 06:55:44 crc kubenswrapper[4546]: I0201 06:55:44.704042 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68c8986777-77ls5"] Feb 01 06:55:45 crc kubenswrapper[4546]: E0201 06:55:45.230562 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Feb 01 06:55:45 crc kubenswrapper[4546]: E0201 06:55:45.230838 4546 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb" Feb 01 06:55:45 crc kubenswrapper[4546]: E0201 06:55:45.231059 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lh46z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(fa29cd22-5996-4415-92c9-8012caf2dcfb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Feb 01 06:55:45 crc kubenswrapper[4546]: E0201 06:55:45.233027 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="fa29cd22-5996-4415-92c9-8012caf2dcfb" Feb 01 06:55:45 crc kubenswrapper[4546]: I0201 06:55:45.605169 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"067bc019-6975-420a-bc5c-0cee9f5ad72f","Type":"ContainerStarted","Data":"7e4ee85ce6a355d7bf2217aab59c04ede8627376149920a3e0c2b3ba34164a86"} Feb 01 06:55:45 crc kubenswrapper[4546]: I0201 06:55:45.612631 4546 generic.go:334] "Generic (PLEG): container finished" podID="dec7c8ef-e98a-4313-b7b6-669c3b9217b9" containerID="6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057" exitCode=0 Feb 01 06:55:45 crc kubenswrapper[4546]: I0201 06:55:45.612742 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668487f585-lq2g4" event={"ID":"dec7c8ef-e98a-4313-b7b6-669c3b9217b9","Type":"ContainerDied","Data":"6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057"} Feb 01 06:55:45 crc kubenswrapper[4546]: E0201 06:55:45.614726 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb\\\"\"" pod="openstack/kube-state-metrics-0" podUID="fa29cd22-5996-4415-92c9-8012caf2dcfb" Feb 01 06:55:45 crc kubenswrapper[4546]: I0201 06:55:45.669343 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb5c8ca-d813-48ae-845c-2093dc75ac66" 
path="/var/lib/kubelet/pods/3eb5c8ca-d813-48ae-845c-2093dc75ac66/volumes" Feb 01 06:55:45 crc kubenswrapper[4546]: I0201 06:55:45.670777 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="871ea9d2-4067-468b-b424-74e94ccb21d3" path="/var/lib/kubelet/pods/871ea9d2-4067-468b-b424-74e94ccb21d3/volumes" Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.623173 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668487f585-lq2g4" event={"ID":"dec7c8ef-e98a-4313-b7b6-669c3b9217b9","Type":"ContainerStarted","Data":"af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0"} Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.623965 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.626600 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c414e341-7093-4755-a22e-1a47be4b1e4c","Type":"ContainerStarted","Data":"44b0afca9a0338ef5cbd41163d4663a848c77e30ed7b9eb40fffe2bb7bd3c0f4"} Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.626750 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.628314 4546 generic.go:334] "Generic (PLEG): container finished" podID="ffe0876a-46be-41a8-8ca5-d99e28e349a6" containerID="e6e8515ed52b771380fb00ddbbaadbcf2dbd128e1b9468f0ba1291f572892701" exitCode=0 Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.628497 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" event={"ID":"ffe0876a-46be-41a8-8ca5-d99e28e349a6","Type":"ContainerDied","Data":"e6e8515ed52b771380fb00ddbbaadbcf2dbd128e1b9468f0ba1291f572892701"} Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.630878 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973","Type":"ContainerStarted","Data":"9ec81dd258fc5363154282f0f86b3edb322ae34700105e5e89c739bb777690b0"} Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.632688 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f9259854-6c00-413e-9061-399c808d9360","Type":"ContainerStarted","Data":"de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9"} Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.638771 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-668487f585-lq2g4" podStartSLOduration=4.233211452 podStartE2EDuration="31.638760951s" podCreationTimestamp="2026-02-01 06:55:15 +0000 UTC" firstStartedPulling="2026-02-01 06:55:16.965568383 +0000 UTC m=+747.616504399" lastFinishedPulling="2026-02-01 06:55:44.371117882 +0000 UTC m=+775.022053898" observedRunningTime="2026-02-01 06:55:46.638045352 +0000 UTC m=+777.288981368" watchObservedRunningTime="2026-02-01 06:55:46.638760951 +0000 UTC m=+777.289696967" Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.643990 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190","Type":"ContainerStarted","Data":"2c541cfca6e3e0f0481fd04a0c98f05c631601b788a6de5b28fa5346f3641364"} Feb 01 06:55:46 crc kubenswrapper[4546]: I0201 06:55:46.691256 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.3449769050000002 podStartE2EDuration="26.691239565s" podCreationTimestamp="2026-02-01 06:55:20 +0000 UTC" firstStartedPulling="2026-02-01 06:55:21.394099343 +0000 UTC m=+752.045035359" lastFinishedPulling="2026-02-01 06:55:44.740362002 +0000 UTC m=+775.391298019" observedRunningTime="2026-02-01 06:55:46.663506888 +0000 UTC m=+777.314442904" watchObservedRunningTime="2026-02-01 06:55:46.691239565 +0000 UTC m=+777.342175581" 
Feb 01 06:55:47 crc kubenswrapper[4546]: I0201 06:55:47.664958 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" event={"ID":"ffe0876a-46be-41a8-8ca5-d99e28e349a6","Type":"ContainerStarted","Data":"0596565a16df24567f2038eb908a208c5f1b2de43012231a2ed636c16f5b8eae"} Feb 01 06:55:47 crc kubenswrapper[4546]: I0201 06:55:47.665313 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:47 crc kubenswrapper[4546]: I0201 06:55:47.678375 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" podStartSLOduration=4.20242067 podStartE2EDuration="31.678359877s" podCreationTimestamp="2026-02-01 06:55:16 +0000 UTC" firstStartedPulling="2026-02-01 06:55:17.263583533 +0000 UTC m=+747.914519550" lastFinishedPulling="2026-02-01 06:55:44.73952274 +0000 UTC m=+775.390458757" observedRunningTime="2026-02-01 06:55:47.671890322 +0000 UTC m=+778.322826338" watchObservedRunningTime="2026-02-01 06:55:47.678359877 +0000 UTC m=+778.329295882" Feb 01 06:55:48 crc kubenswrapper[4546]: I0201 06:55:48.669293 4546 generic.go:334] "Generic (PLEG): container finished" podID="067bc019-6975-420a-bc5c-0cee9f5ad72f" containerID="7e4ee85ce6a355d7bf2217aab59c04ede8627376149920a3e0c2b3ba34164a86" exitCode=0 Feb 01 06:55:48 crc kubenswrapper[4546]: I0201 06:55:48.669362 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"067bc019-6975-420a-bc5c-0cee9f5ad72f","Type":"ContainerDied","Data":"7e4ee85ce6a355d7bf2217aab59c04ede8627376149920a3e0c2b3ba34164a86"} Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.681735 4546 generic.go:334] "Generic (PLEG): container finished" podID="fc24faa8-1959-4a00-9859-80c7fb19bea3" containerID="aa4ab0cddcdb3db70b6748ce0b92be715383789bec8a5a8c7d81c6760cde678d" exitCode=0 Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.682020 
4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4x5w6" event={"ID":"fc24faa8-1959-4a00-9859-80c7fb19bea3","Type":"ContainerDied","Data":"aa4ab0cddcdb3db70b6748ce0b92be715383789bec8a5a8c7d81c6760cde678d"} Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.687758 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"067bc019-6975-420a-bc5c-0cee9f5ad72f","Type":"ContainerStarted","Data":"d2fac7a0efb20f95ac6803c36338bd11b6757c7a44c4824fa3e0cc7ee866bb01"} Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.691588 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e","Type":"ContainerStarted","Data":"2c3afb99acac8da6be247f4e78dd5e156fae54c5d7c0e7ce6d98135dc920bfd4"} Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.694885 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fw66k" event={"ID":"7f0f1cbe-76e7-455a-80da-05602295973b","Type":"ContainerStarted","Data":"18449c03707a2e634d968e51e99222d0b752b4b34663a5bd49acede1ecc56943"} Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.695349 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-fw66k" Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.697090 4546 generic.go:334] "Generic (PLEG): container finished" podID="a0675e35-afd6-4c93-a1fa-6a3ec2dd1190" containerID="2c541cfca6e3e0f0481fd04a0c98f05c631601b788a6de5b28fa5346f3641364" exitCode=0 Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.697170 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190","Type":"ContainerDied","Data":"2c541cfca6e3e0f0481fd04a0c98f05c631601b788a6de5b28fa5346f3641364"} Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.703715 4546 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2025b014-b533-402c-af2f-179c921eb503","Type":"ContainerStarted","Data":"2cce23fb10c530aeeae1243bd0014b913988717447e279586ec6c00065e51c59"} Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.754016 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fw66k" podStartSLOduration=19.197941178 podStartE2EDuration="24.753999292s" podCreationTimestamp="2026-02-01 06:55:25 +0000 UTC" firstStartedPulling="2026-02-01 06:55:43.334009922 +0000 UTC m=+773.984945938" lastFinishedPulling="2026-02-01 06:55:48.890068036 +0000 UTC m=+779.541004052" observedRunningTime="2026-02-01 06:55:49.748068111 +0000 UTC m=+780.399004126" watchObservedRunningTime="2026-02-01 06:55:49.753999292 +0000 UTC m=+780.404935328" Feb 01 06:55:49 crc kubenswrapper[4546]: I0201 06:55:49.784039 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=9.177012267 podStartE2EDuration="31.784025222s" podCreationTimestamp="2026-02-01 06:55:18 +0000 UTC" firstStartedPulling="2026-02-01 06:55:20.872185734 +0000 UTC m=+751.523121750" lastFinishedPulling="2026-02-01 06:55:43.479198689 +0000 UTC m=+774.130134705" observedRunningTime="2026-02-01 06:55:49.781018644 +0000 UTC m=+780.431954660" watchObservedRunningTime="2026-02-01 06:55:49.784025222 +0000 UTC m=+780.434961227" Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.298271 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.298507 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.718328 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"a0675e35-afd6-4c93-a1fa-6a3ec2dd1190","Type":"ContainerStarted","Data":"87d17263a0155298d90be9447c587fbf4036bd6ac3300eec01880880141d95bd"} Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.726457 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4x5w6" event={"ID":"fc24faa8-1959-4a00-9859-80c7fb19bea3","Type":"ContainerStarted","Data":"3b9edc7a9a853959814ff0d8f9ec276a4313a42e5764f562b43ca415e5a07e56"} Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.726495 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4x5w6" event={"ID":"fc24faa8-1959-4a00-9859-80c7fb19bea3","Type":"ContainerStarted","Data":"ca9c7d1725e50420ddc7fbcda83818c5c6c6afad38a3e93bbf701a68488dfdb2"} Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.726956 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.746722 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.80612474 podStartE2EDuration="33.746708849s" podCreationTimestamp="2026-02-01 06:55:17 +0000 UTC" firstStartedPulling="2026-02-01 06:55:19.538659404 +0000 UTC m=+750.189595410" lastFinishedPulling="2026-02-01 06:55:43.479243502 +0000 UTC m=+774.130179519" observedRunningTime="2026-02-01 06:55:50.737281589 +0000 UTC m=+781.388217605" watchObservedRunningTime="2026-02-01 06:55:50.746708849 +0000 UTC m=+781.397644866" Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.749732 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.760094 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4x5w6" podStartSLOduration=20.24497612 podStartE2EDuration="25.760082759s" 
podCreationTimestamp="2026-02-01 06:55:25 +0000 UTC" firstStartedPulling="2026-02-01 06:55:43.3504702 +0000 UTC m=+774.001406216" lastFinishedPulling="2026-02-01 06:55:48.865576839 +0000 UTC m=+779.516512855" observedRunningTime="2026-02-01 06:55:50.759502837 +0000 UTC m=+781.410438853" watchObservedRunningTime="2026-02-01 06:55:50.760082759 +0000 UTC m=+781.411018775" Feb 01 06:55:50 crc kubenswrapper[4546]: I0201 06:55:50.824348 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4x5w6" Feb 01 06:55:51 crc kubenswrapper[4546]: I0201 06:55:51.358016 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:51 crc kubenswrapper[4546]: I0201 06:55:51.735155 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2025b014-b533-402c-af2f-179c921eb503","Type":"ContainerStarted","Data":"7fe311a58e1607f8fdae39ac4f1b10a15eed82c4470350f27e285217016fbdcc"} Feb 01 06:55:51 crc kubenswrapper[4546]: I0201 06:55:51.739289 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8bb3eee7-1aea-4312-a07b-9d4e4a3bc04e","Type":"ContainerStarted","Data":"65f52469cbf1701231873beff8aba37fe9b12dd58bac39a730104899c147d991"} Feb 01 06:55:51 crc kubenswrapper[4546]: I0201 06:55:51.782704 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.035651455 podStartE2EDuration="23.782677672s" podCreationTimestamp="2026-02-01 06:55:28 +0000 UTC" firstStartedPulling="2026-02-01 06:55:43.333515831 +0000 UTC m=+773.984451847" lastFinishedPulling="2026-02-01 06:55:51.080542048 +0000 UTC m=+781.731478064" observedRunningTime="2026-02-01 06:55:51.779029004 +0000 UTC m=+782.429965020" watchObservedRunningTime="2026-02-01 06:55:51.782677672 +0000 UTC m=+782.433613688" Feb 01 06:55:51 crc kubenswrapper[4546]: I0201 
06:55:51.784899 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:55:51 crc kubenswrapper[4546]: I0201 06:55:51.785998 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.035412655 podStartE2EDuration="26.785985358s" podCreationTimestamp="2026-02-01 06:55:25 +0000 UTC" firstStartedPulling="2026-02-01 06:55:43.333944939 +0000 UTC m=+773.984880955" lastFinishedPulling="2026-02-01 06:55:51.084517642 +0000 UTC m=+781.735453658" observedRunningTime="2026-02-01 06:55:51.760505818 +0000 UTC m=+782.411441824" watchObservedRunningTime="2026-02-01 06:55:51.785985358 +0000 UTC m=+782.436921374" Feb 01 06:55:51 crc kubenswrapper[4546]: I0201 06:55:51.853669 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668487f585-lq2g4"] Feb 01 06:55:51 crc kubenswrapper[4546]: I0201 06:55:51.853926 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-668487f585-lq2g4" podUID="dec7c8ef-e98a-4313-b7b6-669c3b9217b9" containerName="dnsmasq-dns" containerID="cri-o://af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0" gracePeriod=10 Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.228586 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.282227 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-dns-svc\") pod \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.282492 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-config\") pod \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.282539 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96q52\" (UniqueName: \"kubernetes.io/projected/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-kube-api-access-96q52\") pod \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\" (UID: \"dec7c8ef-e98a-4313-b7b6-669c3b9217b9\") " Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.291039 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-kube-api-access-96q52" (OuterVolumeSpecName: "kube-api-access-96q52") pod "dec7c8ef-e98a-4313-b7b6-669c3b9217b9" (UID: "dec7c8ef-e98a-4313-b7b6-669c3b9217b9"). InnerVolumeSpecName "kube-api-access-96q52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.316590 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-config" (OuterVolumeSpecName: "config") pod "dec7c8ef-e98a-4313-b7b6-669c3b9217b9" (UID: "dec7c8ef-e98a-4313-b7b6-669c3b9217b9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.335099 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dec7c8ef-e98a-4313-b7b6-669c3b9217b9" (UID: "dec7c8ef-e98a-4313-b7b6-669c3b9217b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.384841 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.384886 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.384899 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96q52\" (UniqueName: \"kubernetes.io/projected/dec7c8ef-e98a-4313-b7b6-669c3b9217b9-kube-api-access-96q52\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.752121 4546 generic.go:334] "Generic (PLEG): container finished" podID="dec7c8ef-e98a-4313-b7b6-669c3b9217b9" containerID="af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0" exitCode=0 Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.752216 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668487f585-lq2g4" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.752211 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668487f585-lq2g4" event={"ID":"dec7c8ef-e98a-4313-b7b6-669c3b9217b9","Type":"ContainerDied","Data":"af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0"} Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.752300 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668487f585-lq2g4" event={"ID":"dec7c8ef-e98a-4313-b7b6-669c3b9217b9","Type":"ContainerDied","Data":"6d482e3f9bdce8c15480b46f9cef0b7e684a221519ae440422d420940466f0b0"} Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.752342 4546 scope.go:117] "RemoveContainer" containerID="af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.782698 4546 scope.go:117] "RemoveContainer" containerID="6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.804678 4546 scope.go:117] "RemoveContainer" containerID="af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0" Feb 01 06:55:52 crc kubenswrapper[4546]: E0201 06:55:52.804944 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0\": container with ID starting with af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0 not found: ID does not exist" containerID="af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.804974 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0"} err="failed to get container status 
\"af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0\": rpc error: code = NotFound desc = could not find container \"af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0\": container with ID starting with af79eeadb1f6be530f65fcbc254a6469b0d6aa7ff2ad741ed731c36b3564c9c0 not found: ID does not exist" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.804992 4546 scope.go:117] "RemoveContainer" containerID="6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057" Feb 01 06:55:52 crc kubenswrapper[4546]: E0201 06:55:52.805163 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057\": container with ID starting with 6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057 not found: ID does not exist" containerID="6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.805186 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057"} err="failed to get container status \"6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057\": rpc error: code = NotFound desc = could not find container \"6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057\": container with ID starting with 6c5cfcaffb36082b07304a878557810b93a1624651b8a997e2b2e2ea6d8d4057 not found: ID does not exist" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.884935 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668487f585-lq2g4"] Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.923419 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-668487f585-lq2g4"] Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.979625 4546 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-677b7d6c7c-g8vfw"] Feb 01 06:55:52 crc kubenswrapper[4546]: E0201 06:55:52.979948 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec7c8ef-e98a-4313-b7b6-669c3b9217b9" containerName="dnsmasq-dns" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.979959 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec7c8ef-e98a-4313-b7b6-669c3b9217b9" containerName="dnsmasq-dns" Feb 01 06:55:52 crc kubenswrapper[4546]: E0201 06:55:52.979984 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec7c8ef-e98a-4313-b7b6-669c3b9217b9" containerName="init" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.979990 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec7c8ef-e98a-4313-b7b6-669c3b9217b9" containerName="init" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.980110 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec7c8ef-e98a-4313-b7b6-669c3b9217b9" containerName="dnsmasq-dns" Feb 01 06:55:52 crc kubenswrapper[4546]: I0201 06:55:52.980847 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.007769 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-677b7d6c7c-g8vfw"] Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.101884 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbn6\" (UniqueName: \"kubernetes.io/projected/a00a3212-cd90-482d-b7c9-b65221423a1f-kube-api-access-xdbn6\") pod \"dnsmasq-dns-677b7d6c7c-g8vfw\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.101970 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-config\") pod \"dnsmasq-dns-677b7d6c7c-g8vfw\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.102088 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-dns-svc\") pod \"dnsmasq-dns-677b7d6c7c-g8vfw\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.203519 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbn6\" (UniqueName: \"kubernetes.io/projected/a00a3212-cd90-482d-b7c9-b65221423a1f-kube-api-access-xdbn6\") pod \"dnsmasq-dns-677b7d6c7c-g8vfw\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.203570 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-config\") pod \"dnsmasq-dns-677b7d6c7c-g8vfw\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.203588 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-dns-svc\") pod \"dnsmasq-dns-677b7d6c7c-g8vfw\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.204470 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-dns-svc\") pod \"dnsmasq-dns-677b7d6c7c-g8vfw\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.204603 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-config\") pod \"dnsmasq-dns-677b7d6c7c-g8vfw\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.222487 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbn6\" (UniqueName: \"kubernetes.io/projected/a00a3212-cd90-482d-b7c9-b65221423a1f-kube-api-access-xdbn6\") pod \"dnsmasq-dns-677b7d6c7c-g8vfw\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.299368 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.665020 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec7c8ef-e98a-4313-b7b6-669c3b9217b9" path="/var/lib/kubelet/pods/dec7c8ef-e98a-4313-b7b6-669c3b9217b9/volumes" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.669487 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.702589 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.757899 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-677b7d6c7c-g8vfw"] Feb 01 06:55:53 crc kubenswrapper[4546]: I0201 06:55:53.758905 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.221476 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.226066 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.228356 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.228630 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.228739 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-kc74n" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.229039 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.243736 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.322968 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/00f597d1-7dec-4229-9b1c-eebfb6958694-lock\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.323041 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.323262 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc 
kubenswrapper[4546]: I0201 06:55:54.323359 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/00f597d1-7dec-4229-9b1c-eebfb6958694-cache\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.323451 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f597d1-7dec-4229-9b1c-eebfb6958694-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.323492 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckvp\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-kube-api-access-zckvp\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.324488 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.356201 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.425702 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.426195 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.426265 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.426344 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/00f597d1-7dec-4229-9b1c-eebfb6958694-cache\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: E0201 06:55:54.426373 4546 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 01 06:55:54 crc kubenswrapper[4546]: E0201 06:55:54.426389 4546 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 01 06:55:54 crc kubenswrapper[4546]: E0201 06:55:54.426446 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift podName:00f597d1-7dec-4229-9b1c-eebfb6958694 nodeName:}" failed. No retries permitted until 2026-02-01 06:55:54.926427818 +0000 UTC m=+785.577363834 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift") pod "swift-storage-0" (UID: "00f597d1-7dec-4229-9b1c-eebfb6958694") : configmap "swift-ring-files" not found Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.426914 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/00f597d1-7dec-4229-9b1c-eebfb6958694-cache\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.427056 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f597d1-7dec-4229-9b1c-eebfb6958694-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.427091 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zckvp\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-kube-api-access-zckvp\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.427145 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/00f597d1-7dec-4229-9b1c-eebfb6958694-lock\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.427706 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/00f597d1-7dec-4229-9b1c-eebfb6958694-lock\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " 
pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.432356 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00f597d1-7dec-4229-9b1c-eebfb6958694-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.442017 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zckvp\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-kube-api-access-zckvp\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.442902 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.774217 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" event={"ID":"a00a3212-cd90-482d-b7c9-b65221423a1f","Type":"ContainerStarted","Data":"22b18489e8d7179e4549d26e854005a3c5f21fb918c78edbca0a46567a04fa44"} Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.774954 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.803404 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.805034 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 01 06:55:54 crc kubenswrapper[4546]: I0201 06:55:54.937740 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:54 crc kubenswrapper[4546]: E0201 06:55:54.938597 4546 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 01 06:55:54 crc kubenswrapper[4546]: E0201 06:55:54.938621 4546 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 01 06:55:54 crc kubenswrapper[4546]: E0201 06:55:54.938672 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift podName:00f597d1-7dec-4229-9b1c-eebfb6958694 nodeName:}" failed. No retries permitted until 2026-02-01 06:55:55.938655247 +0000 UTC m=+786.589591263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift") pod "swift-storage-0" (UID: "00f597d1-7dec-4229-9b1c-eebfb6958694") : configmap "swift-ring-files" not found Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.046474 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-677b7d6c7c-g8vfw"] Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.078329 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fc6bb6ff-47lsc"] Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.079733 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.085715 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.100526 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-gdwmt"] Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.113795 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.120667 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.139074 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fc6bb6ff-47lsc"] Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.143992 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-ovsdbserver-nb\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.144218 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-dns-svc\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.144322 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-config\") pod 
\"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.144406 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2548\" (UniqueName: \"kubernetes.io/projected/486beee8-03da-49ec-b08f-d2cac8a8193b-kube-api-access-n2548\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.174489 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gdwmt"] Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.244659 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fc6bb6ff-47lsc"] Feb 01 06:55:55 crc kubenswrapper[4546]: E0201 06:55:55.245514 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-n2548 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" podUID="486beee8-03da-49ec-b08f-d2cac8a8193b" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.245834 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9c5ee162-6d5a-4d10-916f-9346ded07c71-ovn-rundir\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.245888 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9c5ee162-6d5a-4d10-916f-9346ded07c71-ovs-rundir\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " 
pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.245923 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz4zq\" (UniqueName: \"kubernetes.io/projected/9c5ee162-6d5a-4d10-916f-9346ded07c71-kube-api-access-rz4zq\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.245981 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-ovsdbserver-nb\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.246004 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5ee162-6d5a-4d10-916f-9346ded07c71-config\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.246029 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5ee162-6d5a-4d10-916f-9346ded07c71-combined-ca-bundle\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.246093 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-dns-svc\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " 
pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.246152 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-config\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.246191 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2548\" (UniqueName: \"kubernetes.io/projected/486beee8-03da-49ec-b08f-d2cac8a8193b-kube-api-access-n2548\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.246223 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5ee162-6d5a-4d10-916f-9346ded07c71-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.247084 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-ovsdbserver-nb\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.247477 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-dns-svc\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc 
kubenswrapper[4546]: I0201 06:55:55.247660 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-config\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.262206 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2548\" (UniqueName: \"kubernetes.io/projected/486beee8-03da-49ec-b08f-d2cac8a8193b-kube-api-access-n2548\") pod \"dnsmasq-dns-fc6bb6ff-47lsc\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.268192 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6db666964f-97rr4"] Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.269604 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.275745 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.290906 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6db666964f-97rr4"] Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.349183 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5ee162-6d5a-4d10-916f-9346ded07c71-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.349245 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9c5ee162-6d5a-4d10-916f-9346ded07c71-ovn-rundir\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.349265 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9c5ee162-6d5a-4d10-916f-9346ded07c71-ovs-rundir\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.349292 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz4zq\" (UniqueName: \"kubernetes.io/projected/9c5ee162-6d5a-4d10-916f-9346ded07c71-kube-api-access-rz4zq\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc 
kubenswrapper[4546]: I0201 06:55:55.349350 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgdv7\" (UniqueName: \"kubernetes.io/projected/8850f5c5-318a-476c-8125-55bfcdc24d8b-kube-api-access-mgdv7\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.349380 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-dns-svc\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.349408 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5ee162-6d5a-4d10-916f-9346ded07c71-config\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.349445 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5ee162-6d5a-4d10-916f-9346ded07c71-combined-ca-bundle\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.349472 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-nb\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc 
kubenswrapper[4546]: I0201 06:55:55.349488 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-config\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.349571 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-sb\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.351289 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9c5ee162-6d5a-4d10-916f-9346ded07c71-ovn-rundir\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.351346 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9c5ee162-6d5a-4d10-916f-9346ded07c71-ovs-rundir\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.360159 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.360943 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5ee162-6d5a-4d10-916f-9346ded07c71-config\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " 
pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.361928 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.366800 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-9fr9l" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.367073 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.367208 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.367327 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.368451 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.377762 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5ee162-6d5a-4d10-916f-9346ded07c71-combined-ca-bundle\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.394560 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz4zq\" (UniqueName: \"kubernetes.io/projected/9c5ee162-6d5a-4d10-916f-9346ded07c71-kube-api-access-rz4zq\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.394923 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9c5ee162-6d5a-4d10-916f-9346ded07c71-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gdwmt\" (UID: \"9c5ee162-6d5a-4d10-916f-9346ded07c71\") " pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.451240 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/820da745-0a82-4e3b-8554-9dc02c2ea4b2-scripts\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.451644 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkcwd\" (UniqueName: \"kubernetes.io/projected/820da745-0a82-4e3b-8554-9dc02c2ea4b2-kube-api-access-kkcwd\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.451708 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/820da745-0a82-4e3b-8554-9dc02c2ea4b2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.451746 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-sb\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.451832 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/820da745-0a82-4e3b-8554-9dc02c2ea4b2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.451934 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/820da745-0a82-4e3b-8554-9dc02c2ea4b2-config\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.451999 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgdv7\" (UniqueName: \"kubernetes.io/projected/8850f5c5-318a-476c-8125-55bfcdc24d8b-kube-api-access-mgdv7\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.452032 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-dns-svc\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.452078 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/820da745-0a82-4e3b-8554-9dc02c2ea4b2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.452158 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-nb\") pod \"dnsmasq-dns-6db666964f-97rr4\" 
(UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.452178 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-config\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.452509 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-sb\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.452815 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-dns-svc\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.452911 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-nb\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.452197 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/820da745-0a82-4e3b-8554-9dc02c2ea4b2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc 
kubenswrapper[4546]: I0201 06:55:55.453138 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-config\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.469642 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgdv7\" (UniqueName: \"kubernetes.io/projected/8850f5c5-318a-476c-8125-55bfcdc24d8b-kube-api-access-mgdv7\") pod \"dnsmasq-dns-6db666964f-97rr4\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.473042 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gdwmt" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.554342 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/820da745-0a82-4e3b-8554-9dc02c2ea4b2-config\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.554625 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/820da745-0a82-4e3b-8554-9dc02c2ea4b2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.554725 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/820da745-0a82-4e3b-8554-9dc02c2ea4b2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc 
kubenswrapper[4546]: I0201 06:55:55.554831 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/820da745-0a82-4e3b-8554-9dc02c2ea4b2-scripts\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.554926 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkcwd\" (UniqueName: \"kubernetes.io/projected/820da745-0a82-4e3b-8554-9dc02c2ea4b2-kube-api-access-kkcwd\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.555008 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/820da745-0a82-4e3b-8554-9dc02c2ea4b2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.555436 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/820da745-0a82-4e3b-8554-9dc02c2ea4b2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.555630 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820da745-0a82-4e3b-8554-9dc02c2ea4b2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.555653 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/820da745-0a82-4e3b-8554-9dc02c2ea4b2-scripts\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.556310 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/820da745-0a82-4e3b-8554-9dc02c2ea4b2-config\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.561668 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/820da745-0a82-4e3b-8554-9dc02c2ea4b2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.562336 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/820da745-0a82-4e3b-8554-9dc02c2ea4b2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.562526 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820da745-0a82-4e3b-8554-9dc02c2ea4b2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.578829 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkcwd\" (UniqueName: \"kubernetes.io/projected/820da745-0a82-4e3b-8554-9dc02c2ea4b2-kube-api-access-kkcwd\") pod \"ovn-northd-0\" (UID: \"820da745-0a82-4e3b-8554-9dc02c2ea4b2\") " pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 
06:55:55.610518 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.739813 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.781188 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.794627 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.860740 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2548\" (UniqueName: \"kubernetes.io/projected/486beee8-03da-49ec-b08f-d2cac8a8193b-kube-api-access-n2548\") pod \"486beee8-03da-49ec-b08f-d2cac8a8193b\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.860943 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-ovsdbserver-nb\") pod \"486beee8-03da-49ec-b08f-d2cac8a8193b\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.861029 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-config\") pod \"486beee8-03da-49ec-b08f-d2cac8a8193b\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.861080 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-dns-svc\") pod 
\"486beee8-03da-49ec-b08f-d2cac8a8193b\" (UID: \"486beee8-03da-49ec-b08f-d2cac8a8193b\") " Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.866955 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486beee8-03da-49ec-b08f-d2cac8a8193b-kube-api-access-n2548" (OuterVolumeSpecName: "kube-api-access-n2548") pod "486beee8-03da-49ec-b08f-d2cac8a8193b" (UID: "486beee8-03da-49ec-b08f-d2cac8a8193b"). InnerVolumeSpecName "kube-api-access-n2548". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.882017 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-config" (OuterVolumeSpecName: "config") pod "486beee8-03da-49ec-b08f-d2cac8a8193b" (UID: "486beee8-03da-49ec-b08f-d2cac8a8193b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.882030 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "486beee8-03da-49ec-b08f-d2cac8a8193b" (UID: "486beee8-03da-49ec-b08f-d2cac8a8193b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.882345 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "486beee8-03da-49ec-b08f-d2cac8a8193b" (UID: "486beee8-03da-49ec-b08f-d2cac8a8193b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.910376 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gdwmt"] Feb 01 06:55:55 crc kubenswrapper[4546]: W0201 06:55:55.922422 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c5ee162_6d5a_4d10_916f_9346ded07c71.slice/crio-a1018a78951ba39c8a260f06019f74fb95c111bdc78c7b4bad9eb69abc6f6919 WatchSource:0}: Error finding container a1018a78951ba39c8a260f06019f74fb95c111bdc78c7b4bad9eb69abc6f6919: Status 404 returned error can't find the container with id a1018a78951ba39c8a260f06019f74fb95c111bdc78c7b4bad9eb69abc6f6919 Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.965969 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:55 crc kubenswrapper[4546]: E0201 06:55:55.966128 4546 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 01 06:55:55 crc kubenswrapper[4546]: E0201 06:55:55.966152 4546 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 01 06:55:55 crc kubenswrapper[4546]: E0201 06:55:55.966210 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift podName:00f597d1-7dec-4229-9b1c-eebfb6958694 nodeName:}" failed. No retries permitted until 2026-02-01 06:55:57.966192747 +0000 UTC m=+788.617128763 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift") pod "swift-storage-0" (UID: "00f597d1-7dec-4229-9b1c-eebfb6958694") : configmap "swift-ring-files" not found Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.967361 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2548\" (UniqueName: \"kubernetes.io/projected/486beee8-03da-49ec-b08f-d2cac8a8193b-kube-api-access-n2548\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.967411 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.967424 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.967436 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/486beee8-03da-49ec-b08f-d2cac8a8193b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:55 crc kubenswrapper[4546]: I0201 06:55:55.971686 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 01 06:55:56 crc kubenswrapper[4546]: I0201 06:55:56.016724 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6db666964f-97rr4"] Feb 01 06:55:56 crc kubenswrapper[4546]: I0201 06:55:56.406303 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:56 crc kubenswrapper[4546]: I0201 06:55:56.475618 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 01 06:55:56 crc 
kubenswrapper[4546]: I0201 06:55:56.815316 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"820da745-0a82-4e3b-8554-9dc02c2ea4b2","Type":"ContainerStarted","Data":"9b17f2c8b0d1380d398a9939781e26a36a9849e9704e8c2f280d3b7734798c6e"} Feb 01 06:55:56 crc kubenswrapper[4546]: I0201 06:55:56.821890 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fc6bb6ff-47lsc" Feb 01 06:55:56 crc kubenswrapper[4546]: I0201 06:55:56.821976 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gdwmt" event={"ID":"9c5ee162-6d5a-4d10-916f-9346ded07c71","Type":"ContainerStarted","Data":"a1018a78951ba39c8a260f06019f74fb95c111bdc78c7b4bad9eb69abc6f6919"} Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.015959 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fc6bb6ff-47lsc"] Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.052926 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fc6bb6ff-47lsc"] Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.666255 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486beee8-03da-49ec-b08f-d2cac8a8193b" path="/var/lib/kubelet/pods/486beee8-03da-49ec-b08f-d2cac8a8193b/volumes" Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.839176 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fa29cd22-5996-4415-92c9-8012caf2dcfb","Type":"ContainerStarted","Data":"1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc"} Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.840656 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.840761 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gdwmt" 
event={"ID":"9c5ee162-6d5a-4d10-916f-9346ded07c71","Type":"ContainerStarted","Data":"851a001e9fdeca1e0486e0519cd11a6433500a77fc935ec18bd2301cf09df952"} Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.845652 4546 generic.go:334] "Generic (PLEG): container finished" podID="8850f5c5-318a-476c-8125-55bfcdc24d8b" containerID="1d488405bad77d40bd293106247f4d348d0b9cec9baa599ea3c7b30c45f30da1" exitCode=0 Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.845721 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db666964f-97rr4" event={"ID":"8850f5c5-318a-476c-8125-55bfcdc24d8b","Type":"ContainerDied","Data":"1d488405bad77d40bd293106247f4d348d0b9cec9baa599ea3c7b30c45f30da1"} Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.845792 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db666964f-97rr4" event={"ID":"8850f5c5-318a-476c-8125-55bfcdc24d8b","Type":"ContainerStarted","Data":"d826fc6c7c0c4e7ddcf88e9b79607b0281abcf3005561028e51af05ad8fe7b13"} Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.857256 4546 generic.go:334] "Generic (PLEG): container finished" podID="a00a3212-cd90-482d-b7c9-b65221423a1f" containerID="766d1caa1ec5d03d16344896534cc034300d21e8b8eee073e154b17c34830bb9" exitCode=0 Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.857315 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" event={"ID":"a00a3212-cd90-482d-b7c9-b65221423a1f","Type":"ContainerDied","Data":"766d1caa1ec5d03d16344896534cc034300d21e8b8eee073e154b17c34830bb9"} Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.898105 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.997122557 podStartE2EDuration="35.898091093s" podCreationTimestamp="2026-02-01 06:55:22 +0000 UTC" firstStartedPulling="2026-02-01 06:55:23.1744737 +0000 UTC m=+753.825409717" 
lastFinishedPulling="2026-02-01 06:55:57.075442237 +0000 UTC m=+787.726378253" observedRunningTime="2026-02-01 06:55:57.856695791 +0000 UTC m=+788.507631808" watchObservedRunningTime="2026-02-01 06:55:57.898091093 +0000 UTC m=+788.549027109" Feb 01 06:55:57 crc kubenswrapper[4546]: I0201 06:55:57.912850 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-gdwmt" podStartSLOduration=2.912828454 podStartE2EDuration="2.912828454s" podCreationTimestamp="2026-02-01 06:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:55:57.905905695 +0000 UTC m=+788.556841711" watchObservedRunningTime="2026-02-01 06:55:57.912828454 +0000 UTC m=+788.563764470" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.011215 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-86qhj"] Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.012330 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.021181 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:55:58 crc kubenswrapper[4546]: E0201 06:55:58.021440 4546 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 01 06:55:58 crc kubenswrapper[4546]: E0201 06:55:58.021456 4546 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 01 06:55:58 crc kubenswrapper[4546]: E0201 06:55:58.021501 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift podName:00f597d1-7dec-4229-9b1c-eebfb6958694 nodeName:}" failed. No retries permitted until 2026-02-01 06:56:02.021486548 +0000 UTC m=+792.672422555 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift") pod "swift-storage-0" (UID: "00f597d1-7dec-4229-9b1c-eebfb6958694") : configmap "swift-ring-files" not found Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.033614 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.033803 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.034009 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.046652 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-86qhj"] Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.070473 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-86qhj"] Feb 01 06:55:58 crc kubenswrapper[4546]: E0201 06:55:58.071274 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-pl8wx ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-pl8wx ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-86qhj" podUID="a1b2be11-39c9-4289-b40f-331a9dec6253" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.090001 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mhf5x"] Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.091375 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.099557 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mhf5x"] Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.124138 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-combined-ca-bundle\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.124336 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-swiftconf\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.124374 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1b2be11-39c9-4289-b40f-331a9dec6253-etc-swift\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.124423 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl8wx\" (UniqueName: \"kubernetes.io/projected/a1b2be11-39c9-4289-b40f-331a9dec6253-kube-api-access-pl8wx\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.124448 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-dispersionconf\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.124524 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-scripts\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.124549 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-ring-data-devices\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.227689 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-combined-ca-bundle\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.227797 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-ring-data-devices\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.227966 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-x8p5z\" (UniqueName: \"kubernetes.io/projected/38bd4178-697d-4013-a31e-573c439c9517-kube-api-access-x8p5z\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.228011 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-swiftconf\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.228093 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38bd4178-697d-4013-a31e-573c439c9517-etc-swift\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.228117 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1b2be11-39c9-4289-b40f-331a9dec6253-etc-swift\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.228146 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-dispersionconf\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.228204 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl8wx\" (UniqueName: 
\"kubernetes.io/projected/a1b2be11-39c9-4289-b40f-331a9dec6253-kube-api-access-pl8wx\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.228252 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-dispersionconf\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.228283 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-swiftconf\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.228301 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-combined-ca-bundle\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.229342 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-scripts\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.229431 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-ring-data-devices\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.229480 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-scripts\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.228776 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1b2be11-39c9-4289-b40f-331a9dec6253-etc-swift\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.231586 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-scripts\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.231703 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-ring-data-devices\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.238183 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-dispersionconf\") pod \"swift-ring-rebalance-86qhj\" 
(UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.238550 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-swiftconf\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.240106 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-combined-ca-bundle\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.240663 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.245203 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl8wx\" (UniqueName: \"kubernetes.io/projected/a1b2be11-39c9-4289-b40f-331a9dec6253-kube-api-access-pl8wx\") pod \"swift-ring-rebalance-86qhj\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.330166 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-dns-svc\") pod \"a00a3212-cd90-482d-b7c9-b65221423a1f\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.330333 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-config\") pod \"a00a3212-cd90-482d-b7c9-b65221423a1f\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.330385 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdbn6\" (UniqueName: \"kubernetes.io/projected/a00a3212-cd90-482d-b7c9-b65221423a1f-kube-api-access-xdbn6\") pod \"a00a3212-cd90-482d-b7c9-b65221423a1f\" (UID: \"a00a3212-cd90-482d-b7c9-b65221423a1f\") " Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.330836 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-combined-ca-bundle\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.330879 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-swiftconf\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.330925 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-scripts\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.330957 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-ring-data-devices\") pod \"swift-ring-rebalance-mhf5x\" (UID: 
\"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.331005 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8p5z\" (UniqueName: \"kubernetes.io/projected/38bd4178-697d-4013-a31e-573c439c9517-kube-api-access-x8p5z\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.331048 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38bd4178-697d-4013-a31e-573c439c9517-etc-swift\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.331068 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-dispersionconf\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.331782 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-scripts\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.332917 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-ring-data-devices\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 
06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.333160 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38bd4178-697d-4013-a31e-573c439c9517-etc-swift\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.333785 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a00a3212-cd90-482d-b7c9-b65221423a1f-kube-api-access-xdbn6" (OuterVolumeSpecName: "kube-api-access-xdbn6") pod "a00a3212-cd90-482d-b7c9-b65221423a1f" (UID: "a00a3212-cd90-482d-b7c9-b65221423a1f"). InnerVolumeSpecName "kube-api-access-xdbn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.337566 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-swiftconf\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.338633 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-combined-ca-bundle\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.344165 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-dispersionconf\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 
06:55:58.347136 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8p5z\" (UniqueName: \"kubernetes.io/projected/38bd4178-697d-4013-a31e-573c439c9517-kube-api-access-x8p5z\") pod \"swift-ring-rebalance-mhf5x\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.349538 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a00a3212-cd90-482d-b7c9-b65221423a1f" (UID: "a00a3212-cd90-482d-b7c9-b65221423a1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.364031 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-config" (OuterVolumeSpecName: "config") pod "a00a3212-cd90-482d-b7c9-b65221423a1f" (UID: "a00a3212-cd90-482d-b7c9-b65221423a1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.428597 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.431955 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.431977 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a3212-cd90-482d-b7c9-b65221423a1f-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.431987 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdbn6\" (UniqueName: \"kubernetes.io/projected/a00a3212-cd90-482d-b7c9-b65221423a1f-kube-api-access-xdbn6\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.874941 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"820da745-0a82-4e3b-8554-9dc02c2ea4b2","Type":"ContainerStarted","Data":"2402cc6e871f67032c180cf7d7c4ec95b5a8a1478ac25b2fe53cad394b835437"} Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.875489 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"820da745-0a82-4e3b-8554-9dc02c2ea4b2","Type":"ContainerStarted","Data":"2dacee05af1f8de9ad50cb0b0dccf4364a1e5a03e925852750fab0d83574933f"} Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.875538 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.886606 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db666964f-97rr4" event={"ID":"8850f5c5-318a-476c-8125-55bfcdc24d8b","Type":"ContainerStarted","Data":"883e7ee92dc9c006c2ee3b5023b11b62bca942ddbb338cc21935a5813481af6a"} Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.887183 4546 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.889881 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" event={"ID":"a00a3212-cd90-482d-b7c9-b65221423a1f","Type":"ContainerDied","Data":"22b18489e8d7179e4549d26e854005a3c5f21fb918c78edbca0a46567a04fa44"} Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.889939 4546 scope.go:117] "RemoveContainer" containerID="766d1caa1ec5d03d16344896534cc034300d21e8b8eee073e154b17c34830bb9" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.890108 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-677b7d6c7c-g8vfw" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.890520 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.907876 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mhf5x"] Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.928456 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.934604 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.244560352 podStartE2EDuration="3.934580047s" podCreationTimestamp="2026-02-01 06:55:55 +0000 UTC" firstStartedPulling="2026-02-01 06:55:56.798979414 +0000 UTC m=+787.449915430" lastFinishedPulling="2026-02-01 06:55:58.488999108 +0000 UTC m=+789.139935125" observedRunningTime="2026-02-01 06:55:58.907533984 +0000 UTC m=+789.558470001" watchObservedRunningTime="2026-02-01 06:55:58.934580047 +0000 UTC m=+789.585516063" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.941045 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6db666964f-97rr4" podStartSLOduration=3.94103269 podStartE2EDuration="3.94103269s" podCreationTimestamp="2026-02-01 06:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:55:58.939874006 +0000 UTC m=+789.590810022" watchObservedRunningTime="2026-02-01 06:55:58.94103269 +0000 UTC m=+789.591968706" Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.986549 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-677b7d6c7c-g8vfw"] Feb 01 06:55:58 crc kubenswrapper[4546]: I0201 06:55:58.992960 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-677b7d6c7c-g8vfw"] Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.013694 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.013752 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.049949 4546 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl8wx\" (UniqueName: \"kubernetes.io/projected/a1b2be11-39c9-4289-b40f-331a9dec6253-kube-api-access-pl8wx\") pod \"a1b2be11-39c9-4289-b40f-331a9dec6253\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.050007 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-scripts\") pod \"a1b2be11-39c9-4289-b40f-331a9dec6253\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.050077 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-combined-ca-bundle\") pod \"a1b2be11-39c9-4289-b40f-331a9dec6253\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.050164 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-dispersionconf\") pod \"a1b2be11-39c9-4289-b40f-331a9dec6253\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.050181 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-swiftconf\") pod \"a1b2be11-39c9-4289-b40f-331a9dec6253\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.050253 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-ring-data-devices\") pod 
\"a1b2be11-39c9-4289-b40f-331a9dec6253\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.050289 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1b2be11-39c9-4289-b40f-331a9dec6253-etc-swift\") pod \"a1b2be11-39c9-4289-b40f-331a9dec6253\" (UID: \"a1b2be11-39c9-4289-b40f-331a9dec6253\") " Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.050920 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b2be11-39c9-4289-b40f-331a9dec6253-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a1b2be11-39c9-4289-b40f-331a9dec6253" (UID: "a1b2be11-39c9-4289-b40f-331a9dec6253"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.057493 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a1b2be11-39c9-4289-b40f-331a9dec6253" (UID: "a1b2be11-39c9-4289-b40f-331a9dec6253"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.057870 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-scripts" (OuterVolumeSpecName: "scripts") pod "a1b2be11-39c9-4289-b40f-331a9dec6253" (UID: "a1b2be11-39c9-4289-b40f-331a9dec6253"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.058841 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b2be11-39c9-4289-b40f-331a9dec6253-kube-api-access-pl8wx" (OuterVolumeSpecName: "kube-api-access-pl8wx") pod "a1b2be11-39c9-4289-b40f-331a9dec6253" (UID: "a1b2be11-39c9-4289-b40f-331a9dec6253"). InnerVolumeSpecName "kube-api-access-pl8wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.059272 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a1b2be11-39c9-4289-b40f-331a9dec6253" (UID: "a1b2be11-39c9-4289-b40f-331a9dec6253"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.059527 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a1b2be11-39c9-4289-b40f-331a9dec6253" (UID: "a1b2be11-39c9-4289-b40f-331a9dec6253"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.060040 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1b2be11-39c9-4289-b40f-331a9dec6253" (UID: "a1b2be11-39c9-4289-b40f-331a9dec6253"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.070754 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-25ztg"] Feb 01 06:55:59 crc kubenswrapper[4546]: E0201 06:55:59.071224 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00a3212-cd90-482d-b7c9-b65221423a1f" containerName="init" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.071250 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00a3212-cd90-482d-b7c9-b65221423a1f" containerName="init" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.071433 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00a3212-cd90-482d-b7c9-b65221423a1f" containerName="init" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.072102 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-25ztg" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.075633 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.082665 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-25ztg"] Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.125933 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.151938 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mlf4\" (UniqueName: \"kubernetes.io/projected/e6fecad8-69a3-4252-b812-d1d6f0422691-kube-api-access-9mlf4\") pod \"root-account-create-update-25ztg\" (UID: \"e6fecad8-69a3-4252-b812-d1d6f0422691\") " pod="openstack/root-account-create-update-25ztg" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.152491 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6fecad8-69a3-4252-b812-d1d6f0422691-operator-scripts\") pod \"root-account-create-update-25ztg\" (UID: \"e6fecad8-69a3-4252-b812-d1d6f0422691\") " pod="openstack/root-account-create-update-25ztg" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.152583 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl8wx\" (UniqueName: \"kubernetes.io/projected/a1b2be11-39c9-4289-b40f-331a9dec6253-kube-api-access-pl8wx\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.152597 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.152607 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.152617 4546 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.152659 4546 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a1b2be11-39c9-4289-b40f-331a9dec6253-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.152677 4546 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a1b2be11-39c9-4289-b40f-331a9dec6253-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:59 crc kubenswrapper[4546]: 
I0201 06:55:59.152688 4546 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a1b2be11-39c9-4289-b40f-331a9dec6253-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.255242 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6fecad8-69a3-4252-b812-d1d6f0422691-operator-scripts\") pod \"root-account-create-update-25ztg\" (UID: \"e6fecad8-69a3-4252-b812-d1d6f0422691\") " pod="openstack/root-account-create-update-25ztg" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.255517 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mlf4\" (UniqueName: \"kubernetes.io/projected/e6fecad8-69a3-4252-b812-d1d6f0422691-kube-api-access-9mlf4\") pod \"root-account-create-update-25ztg\" (UID: \"e6fecad8-69a3-4252-b812-d1d6f0422691\") " pod="openstack/root-account-create-update-25ztg" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.256166 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6fecad8-69a3-4252-b812-d1d6f0422691-operator-scripts\") pod \"root-account-create-update-25ztg\" (UID: \"e6fecad8-69a3-4252-b812-d1d6f0422691\") " pod="openstack/root-account-create-update-25ztg" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.272757 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mlf4\" (UniqueName: \"kubernetes.io/projected/e6fecad8-69a3-4252-b812-d1d6f0422691-kube-api-access-9mlf4\") pod \"root-account-create-update-25ztg\" (UID: \"e6fecad8-69a3-4252-b812-d1d6f0422691\") " pod="openstack/root-account-create-update-25ztg" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.386913 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-25ztg" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.668102 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00a3212-cd90-482d-b7c9-b65221423a1f" path="/var/lib/kubelet/pods/a00a3212-cd90-482d-b7c9-b65221423a1f/volumes" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.902749 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mhf5x" event={"ID":"38bd4178-697d-4013-a31e-573c439c9517","Type":"ContainerStarted","Data":"38644f846f81357a421773572c0eab925d85a67b559e506b4531c9daf3b078dc"} Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.902913 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-86qhj" Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.910893 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-25ztg"] Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.957052 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-86qhj"] Feb 01 06:55:59 crc kubenswrapper[4546]: I0201 06:55:59.964161 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-86qhj"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.008230 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.244347 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8d7b7"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.248517 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8d7b7"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.248611 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.278648 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c30dadf5-37b7-47b9-883f-556482c1e8d0-operator-scripts\") pod \"keystone-db-create-8d7b7\" (UID: \"c30dadf5-37b7-47b9-883f-556482c1e8d0\") " pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.278747 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k9qn\" (UniqueName: \"kubernetes.io/projected/c30dadf5-37b7-47b9-883f-556482c1e8d0-kube-api-access-4k9qn\") pod \"keystone-db-create-8d7b7\" (UID: \"c30dadf5-37b7-47b9-883f-556482c1e8d0\") " pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.356424 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8fd9-account-create-update-cgcz9"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.357643 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.366227 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.376886 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8fd9-account-create-update-cgcz9"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.381180 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-operator-scripts\") pod \"keystone-8fd9-account-create-update-cgcz9\" (UID: \"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\") " pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.381295 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c30dadf5-37b7-47b9-883f-556482c1e8d0-operator-scripts\") pod \"keystone-db-create-8d7b7\" (UID: \"c30dadf5-37b7-47b9-883f-556482c1e8d0\") " pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.381359 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k9qn\" (UniqueName: \"kubernetes.io/projected/c30dadf5-37b7-47b9-883f-556482c1e8d0-kube-api-access-4k9qn\") pod \"keystone-db-create-8d7b7\" (UID: \"c30dadf5-37b7-47b9-883f-556482c1e8d0\") " pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.381399 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpnjl\" (UniqueName: \"kubernetes.io/projected/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-kube-api-access-tpnjl\") pod \"keystone-8fd9-account-create-update-cgcz9\" (UID: 
\"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\") " pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.382113 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c30dadf5-37b7-47b9-883f-556482c1e8d0-operator-scripts\") pod \"keystone-db-create-8d7b7\" (UID: \"c30dadf5-37b7-47b9-883f-556482c1e8d0\") " pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.399367 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k9qn\" (UniqueName: \"kubernetes.io/projected/c30dadf5-37b7-47b9-883f-556482c1e8d0-kube-api-access-4k9qn\") pod \"keystone-db-create-8d7b7\" (UID: \"c30dadf5-37b7-47b9-883f-556482c1e8d0\") " pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.484446 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-operator-scripts\") pod \"keystone-8fd9-account-create-update-cgcz9\" (UID: \"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\") " pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.484811 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpnjl\" (UniqueName: \"kubernetes.io/projected/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-kube-api-access-tpnjl\") pod \"keystone-8fd9-account-create-update-cgcz9\" (UID: \"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\") " pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.485294 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-operator-scripts\") pod 
\"keystone-8fd9-account-create-update-cgcz9\" (UID: \"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\") " pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.506837 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpnjl\" (UniqueName: \"kubernetes.io/projected/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-kube-api-access-tpnjl\") pod \"keystone-8fd9-account-create-update-cgcz9\" (UID: \"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\") " pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.558172 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rsx9q"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.561307 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.568813 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rsx9q"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.590926 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spzh6\" (UniqueName: \"kubernetes.io/projected/31ff518a-27b9-4b86-a833-9e45e65104e2-kube-api-access-spzh6\") pod \"placement-db-create-rsx9q\" (UID: \"31ff518a-27b9-4b86-a833-9e45e65104e2\") " pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.591152 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31ff518a-27b9-4b86-a833-9e45e65104e2-operator-scripts\") pod \"placement-db-create-rsx9q\" (UID: \"31ff518a-27b9-4b86-a833-9e45e65104e2\") " pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.592818 4546 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.676577 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.686724 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3734-account-create-update-h67dn"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.689487 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.692452 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spzh6\" (UniqueName: \"kubernetes.io/projected/31ff518a-27b9-4b86-a833-9e45e65104e2-kube-api-access-spzh6\") pod \"placement-db-create-rsx9q\" (UID: \"31ff518a-27b9-4b86-a833-9e45e65104e2\") " pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.692630 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31ff518a-27b9-4b86-a833-9e45e65104e2-operator-scripts\") pod \"placement-db-create-rsx9q\" (UID: \"31ff518a-27b9-4b86-a833-9e45e65104e2\") " pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.694735 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31ff518a-27b9-4b86-a833-9e45e65104e2-operator-scripts\") pod \"placement-db-create-rsx9q\" (UID: \"31ff518a-27b9-4b86-a833-9e45e65104e2\") " pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.695236 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 01 06:56:00 crc 
kubenswrapper[4546]: I0201 06:56:00.714184 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3734-account-create-update-h67dn"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.719148 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spzh6\" (UniqueName: \"kubernetes.io/projected/31ff518a-27b9-4b86-a833-9e45e65104e2-kube-api-access-spzh6\") pod \"placement-db-create-rsx9q\" (UID: \"31ff518a-27b9-4b86-a833-9e45e65104e2\") " pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.794992 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5724b55b-9e2b-481e-8850-95a521a27999-operator-scripts\") pod \"placement-3734-account-create-update-h67dn\" (UID: \"5724b55b-9e2b-481e-8850-95a521a27999\") " pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.795473 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt7m4\" (UniqueName: \"kubernetes.io/projected/5724b55b-9e2b-481e-8850-95a521a27999-kube-api-access-dt7m4\") pod \"placement-3734-account-create-update-h67dn\" (UID: \"5724b55b-9e2b-481e-8850-95a521a27999\") " pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.876801 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.897665 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt7m4\" (UniqueName: \"kubernetes.io/projected/5724b55b-9e2b-481e-8850-95a521a27999-kube-api-access-dt7m4\") pod \"placement-3734-account-create-update-h67dn\" (UID: \"5724b55b-9e2b-481e-8850-95a521a27999\") " pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.897776 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5724b55b-9e2b-481e-8850-95a521a27999-operator-scripts\") pod \"placement-3734-account-create-update-h67dn\" (UID: \"5724b55b-9e2b-481e-8850-95a521a27999\") " pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.900120 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5724b55b-9e2b-481e-8850-95a521a27999-operator-scripts\") pod \"placement-3734-account-create-update-h67dn\" (UID: \"5724b55b-9e2b-481e-8850-95a521a27999\") " pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.917258 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-hs7bh"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.919520 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.928380 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hs7bh"] Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.929520 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt7m4\" (UniqueName: \"kubernetes.io/projected/5724b55b-9e2b-481e-8850-95a521a27999-kube-api-access-dt7m4\") pod \"placement-3734-account-create-update-h67dn\" (UID: \"5724b55b-9e2b-481e-8850-95a521a27999\") " pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.940830 4546 generic.go:334] "Generic (PLEG): container finished" podID="e6fecad8-69a3-4252-b812-d1d6f0422691" containerID="e9354b7a901cca0c83049b71242cddc2500c450838deed30777b03f96b810771" exitCode=0 Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.941028 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-25ztg" event={"ID":"e6fecad8-69a3-4252-b812-d1d6f0422691","Type":"ContainerDied","Data":"e9354b7a901cca0c83049b71242cddc2500c450838deed30777b03f96b810771"} Feb 01 06:56:00 crc kubenswrapper[4546]: I0201 06:56:00.941097 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-25ztg" event={"ID":"e6fecad8-69a3-4252-b812-d1d6f0422691","Type":"ContainerStarted","Data":"67a8bc36dbadeec1b7aef0dbe474e7b22c5fc94c75952083b336e6ae54c3aa5b"} Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.011012 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486e299d-c690-4f46-8197-e531c53a8b11-operator-scripts\") pod \"glance-db-create-hs7bh\" (UID: \"486e299d-c690-4f46-8197-e531c53a8b11\") " pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.011533 
4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5j59\" (UniqueName: \"kubernetes.io/projected/486e299d-c690-4f46-8197-e531c53a8b11-kube-api-access-l5j59\") pod \"glance-db-create-hs7bh\" (UID: \"486e299d-c690-4f46-8197-e531c53a8b11\") " pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.035259 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.056054 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7ee6-account-create-update-pn854"] Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.057323 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.060059 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.082032 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7ee6-account-create-update-pn854"] Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.101616 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8d7b7"] Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.113461 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2tcp\" (UniqueName: \"kubernetes.io/projected/e916bb26-02e3-4748-b195-0bb5bf550f71-kube-api-access-x2tcp\") pod \"glance-7ee6-account-create-update-pn854\" (UID: \"e916bb26-02e3-4748-b195-0bb5bf550f71\") " pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.113518 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l5j59\" (UniqueName: \"kubernetes.io/projected/486e299d-c690-4f46-8197-e531c53a8b11-kube-api-access-l5j59\") pod \"glance-db-create-hs7bh\" (UID: \"486e299d-c690-4f46-8197-e531c53a8b11\") " pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.113613 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e916bb26-02e3-4748-b195-0bb5bf550f71-operator-scripts\") pod \"glance-7ee6-account-create-update-pn854\" (UID: \"e916bb26-02e3-4748-b195-0bb5bf550f71\") " pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.113681 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486e299d-c690-4f46-8197-e531c53a8b11-operator-scripts\") pod \"glance-db-create-hs7bh\" (UID: \"486e299d-c690-4f46-8197-e531c53a8b11\") " pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.115698 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486e299d-c690-4f46-8197-e531c53a8b11-operator-scripts\") pod \"glance-db-create-hs7bh\" (UID: \"486e299d-c690-4f46-8197-e531c53a8b11\") " pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.131130 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5j59\" (UniqueName: \"kubernetes.io/projected/486e299d-c690-4f46-8197-e531c53a8b11-kube-api-access-l5j59\") pod \"glance-db-create-hs7bh\" (UID: \"486e299d-c690-4f46-8197-e531c53a8b11\") " pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.215333 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e916bb26-02e3-4748-b195-0bb5bf550f71-operator-scripts\") pod \"glance-7ee6-account-create-update-pn854\" (UID: \"e916bb26-02e3-4748-b195-0bb5bf550f71\") " pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.215486 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2tcp\" (UniqueName: \"kubernetes.io/projected/e916bb26-02e3-4748-b195-0bb5bf550f71-kube-api-access-x2tcp\") pod \"glance-7ee6-account-create-update-pn854\" (UID: \"e916bb26-02e3-4748-b195-0bb5bf550f71\") " pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.216484 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e916bb26-02e3-4748-b195-0bb5bf550f71-operator-scripts\") pod \"glance-7ee6-account-create-update-pn854\" (UID: \"e916bb26-02e3-4748-b195-0bb5bf550f71\") " pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.219210 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8fd9-account-create-update-cgcz9"] Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.232584 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2tcp\" (UniqueName: \"kubernetes.io/projected/e916bb26-02e3-4748-b195-0bb5bf550f71-kube-api-access-x2tcp\") pod \"glance-7ee6-account-create-update-pn854\" (UID: \"e916bb26-02e3-4748-b195-0bb5bf550f71\") " pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.238498 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.392159 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.394745 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rsx9q"] Feb 01 06:56:01 crc kubenswrapper[4546]: W0201 06:56:01.419708 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31ff518a_27b9_4b86_a833_9e45e65104e2.slice/crio-096626ffc7114c93f4bcb4d1cdeba507d564b0bb8513ccc3235bb750a37fce15 WatchSource:0}: Error finding container 096626ffc7114c93f4bcb4d1cdeba507d564b0bb8513ccc3235bb750a37fce15: Status 404 returned error can't find the container with id 096626ffc7114c93f4bcb4d1cdeba507d564b0bb8513ccc3235bb750a37fce15 Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.526285 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3734-account-create-update-h67dn"] Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.674136 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b2be11-39c9-4289-b40f-331a9dec6253" path="/var/lib/kubelet/pods/a1b2be11-39c9-4289-b40f-331a9dec6253/volumes" Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.773775 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hs7bh"] Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.925421 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7ee6-account-create-update-pn854"] Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.950444 4546 generic.go:334] "Generic (PLEG): container finished" podID="5724b55b-9e2b-481e-8850-95a521a27999" containerID="ffae3c82fb2025796d0dd09d24e6ada2b325abe3a9b385877a082ad57ca51dfd" exitCode=0 Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.950552 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3734-account-create-update-h67dn" 
event={"ID":"5724b55b-9e2b-481e-8850-95a521a27999","Type":"ContainerDied","Data":"ffae3c82fb2025796d0dd09d24e6ada2b325abe3a9b385877a082ad57ca51dfd"} Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.950591 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3734-account-create-update-h67dn" event={"ID":"5724b55b-9e2b-481e-8850-95a521a27999","Type":"ContainerStarted","Data":"9dc5b8aa3ee0e0cc0e9bee51011f0fa16e567ac143957f74ed72f26c087d1891"} Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.952291 4546 generic.go:334] "Generic (PLEG): container finished" podID="c30dadf5-37b7-47b9-883f-556482c1e8d0" containerID="2db4c7f532485ba6e41af2dbb05cb3e029aca30230c80ab58fdfbd775a81cdb5" exitCode=0 Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.952387 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8d7b7" event={"ID":"c30dadf5-37b7-47b9-883f-556482c1e8d0","Type":"ContainerDied","Data":"2db4c7f532485ba6e41af2dbb05cb3e029aca30230c80ab58fdfbd775a81cdb5"} Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.952435 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8d7b7" event={"ID":"c30dadf5-37b7-47b9-883f-556482c1e8d0","Type":"ContainerStarted","Data":"db0b62e276e0f037ccbe9f94ba84f820fec8ecbf20cd8151cc274052cefea17d"} Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.953825 4546 generic.go:334] "Generic (PLEG): container finished" podID="31ff518a-27b9-4b86-a833-9e45e65104e2" containerID="6a2bc8eaca085f28b306ca7616e42e02eb91939c79167a0f97c847eb5e8adaed" exitCode=0 Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.953900 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rsx9q" event={"ID":"31ff518a-27b9-4b86-a833-9e45e65104e2","Type":"ContainerDied","Data":"6a2bc8eaca085f28b306ca7616e42e02eb91939c79167a0f97c847eb5e8adaed"} Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.953918 4546 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rsx9q" event={"ID":"31ff518a-27b9-4b86-a833-9e45e65104e2","Type":"ContainerStarted","Data":"096626ffc7114c93f4bcb4d1cdeba507d564b0bb8513ccc3235bb750a37fce15"} Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.955044 4546 generic.go:334] "Generic (PLEG): container finished" podID="4de11318-8cb2-44c8-ab01-41be6b3cd1c8" containerID="cbd618404a3c0c640ef1da47a81380e517d96a20c1dcbeed65fab5d8c94da8aa" exitCode=0 Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.955396 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8fd9-account-create-update-cgcz9" event={"ID":"4de11318-8cb2-44c8-ab01-41be6b3cd1c8","Type":"ContainerDied","Data":"cbd618404a3c0c640ef1da47a81380e517d96a20c1dcbeed65fab5d8c94da8aa"} Feb 01 06:56:01 crc kubenswrapper[4546]: I0201 06:56:01.955451 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8fd9-account-create-update-cgcz9" event={"ID":"4de11318-8cb2-44c8-ab01-41be6b3cd1c8","Type":"ContainerStarted","Data":"cb250ce304a406527df40b004c78059ef2f984dc0b44f1423363e0ac16014df1"} Feb 01 06:56:02 crc kubenswrapper[4546]: I0201 06:56:02.030264 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:56:02 crc kubenswrapper[4546]: E0201 06:56:02.030498 4546 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 01 06:56:02 crc kubenswrapper[4546]: E0201 06:56:02.030519 4546 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 01 06:56:02 crc kubenswrapper[4546]: E0201 06:56:02.030581 4546 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift podName:00f597d1-7dec-4229-9b1c-eebfb6958694 nodeName:}" failed. No retries permitted until 2026-02-01 06:56:10.030564864 +0000 UTC m=+800.681500870 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift") pod "swift-storage-0" (UID: "00f597d1-7dec-4229-9b1c-eebfb6958694") : configmap "swift-ring-files" not found Feb 01 06:56:02 crc kubenswrapper[4546]: I0201 06:56:02.895847 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 01 06:56:03 crc kubenswrapper[4546]: W0201 06:56:03.858446 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod486e299d_c690_4f46_8197_e531c53a8b11.slice/crio-dcd085dbef1a5ab1f156ccb542dc719074dab5ef435c278cc5d5150d07834101 WatchSource:0}: Error finding container dcd085dbef1a5ab1f156ccb542dc719074dab5ef435c278cc5d5150d07834101: Status 404 returned error can't find the container with id dcd085dbef1a5ab1f156ccb542dc719074dab5ef435c278cc5d5150d07834101 Feb 01 06:56:03 crc kubenswrapper[4546]: W0201 06:56:03.860478 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode916bb26_02e3_4748_b195_0bb5bf550f71.slice/crio-08752b0578b15cf33b7098c2e33845a1399febbe568c0fbe7441c8aa969d3e26 WatchSource:0}: Error finding container 08752b0578b15cf33b7098c2e33845a1399febbe568c0fbe7441c8aa969d3e26: Status 404 returned error can't find the container with id 08752b0578b15cf33b7098c2e33845a1399febbe568c0fbe7441c8aa969d3e26 Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.976076 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hs7bh" 
event={"ID":"486e299d-c690-4f46-8197-e531c53a8b11","Type":"ContainerStarted","Data":"dcd085dbef1a5ab1f156ccb542dc719074dab5ef435c278cc5d5150d07834101"} Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.978085 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3734-account-create-update-h67dn" event={"ID":"5724b55b-9e2b-481e-8850-95a521a27999","Type":"ContainerDied","Data":"9dc5b8aa3ee0e0cc0e9bee51011f0fa16e567ac143957f74ed72f26c087d1891"} Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.978151 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dc5b8aa3ee0e0cc0e9bee51011f0fa16e567ac143957f74ed72f26c087d1891" Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.979417 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7ee6-account-create-update-pn854" event={"ID":"e916bb26-02e3-4748-b195-0bb5bf550f71","Type":"ContainerStarted","Data":"08752b0578b15cf33b7098c2e33845a1399febbe568c0fbe7441c8aa969d3e26"} Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.980807 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8d7b7" event={"ID":"c30dadf5-37b7-47b9-883f-556482c1e8d0","Type":"ContainerDied","Data":"db0b62e276e0f037ccbe9f94ba84f820fec8ecbf20cd8151cc274052cefea17d"} Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.980843 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db0b62e276e0f037ccbe9f94ba84f820fec8ecbf20cd8151cc274052cefea17d" Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.982678 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rsx9q" event={"ID":"31ff518a-27b9-4b86-a833-9e45e65104e2","Type":"ContainerDied","Data":"096626ffc7114c93f4bcb4d1cdeba507d564b0bb8513ccc3235bb750a37fce15"} Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.982712 4546 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="096626ffc7114c93f4bcb4d1cdeba507d564b0bb8513ccc3235bb750a37fce15" Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.984079 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8fd9-account-create-update-cgcz9" event={"ID":"4de11318-8cb2-44c8-ab01-41be6b3cd1c8","Type":"ContainerDied","Data":"cb250ce304a406527df40b004c78059ef2f984dc0b44f1423363e0ac16014df1"} Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.984105 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb250ce304a406527df40b004c78059ef2f984dc0b44f1423363e0ac16014df1" Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.985454 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-25ztg" event={"ID":"e6fecad8-69a3-4252-b812-d1d6f0422691","Type":"ContainerDied","Data":"67a8bc36dbadeec1b7aef0dbe474e7b22c5fc94c75952083b336e6ae54c3aa5b"} Feb 01 06:56:03 crc kubenswrapper[4546]: I0201 06:56:03.985486 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67a8bc36dbadeec1b7aef0dbe474e7b22c5fc94c75952083b336e6ae54c3aa5b" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.041723 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-25ztg" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.068974 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6fecad8-69a3-4252-b812-d1d6f0422691-operator-scripts\") pod \"e6fecad8-69a3-4252-b812-d1d6f0422691\" (UID: \"e6fecad8-69a3-4252-b812-d1d6f0422691\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.069052 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mlf4\" (UniqueName: \"kubernetes.io/projected/e6fecad8-69a3-4252-b812-d1d6f0422691-kube-api-access-9mlf4\") pod \"e6fecad8-69a3-4252-b812-d1d6f0422691\" (UID: \"e6fecad8-69a3-4252-b812-d1d6f0422691\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.069555 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fecad8-69a3-4252-b812-d1d6f0422691-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6fecad8-69a3-4252-b812-d1d6f0422691" (UID: "e6fecad8-69a3-4252-b812-d1d6f0422691"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.081300 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fecad8-69a3-4252-b812-d1d6f0422691-kube-api-access-9mlf4" (OuterVolumeSpecName: "kube-api-access-9mlf4") pod "e6fecad8-69a3-4252-b812-d1d6f0422691" (UID: "e6fecad8-69a3-4252-b812-d1d6f0422691"). InnerVolumeSpecName "kube-api-access-9mlf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.081534 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.135285 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.156802 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.161963 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.170586 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpnjl\" (UniqueName: \"kubernetes.io/projected/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-kube-api-access-tpnjl\") pod \"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\" (UID: \"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.170646 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5724b55b-9e2b-481e-8850-95a521a27999-operator-scripts\") pod \"5724b55b-9e2b-481e-8850-95a521a27999\" (UID: \"5724b55b-9e2b-481e-8850-95a521a27999\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.170725 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spzh6\" (UniqueName: \"kubernetes.io/projected/31ff518a-27b9-4b86-a833-9e45e65104e2-kube-api-access-spzh6\") pod \"31ff518a-27b9-4b86-a833-9e45e65104e2\" (UID: \"31ff518a-27b9-4b86-a833-9e45e65104e2\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.170783 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/31ff518a-27b9-4b86-a833-9e45e65104e2-operator-scripts\") pod \"31ff518a-27b9-4b86-a833-9e45e65104e2\" (UID: \"31ff518a-27b9-4b86-a833-9e45e65104e2\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.170825 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k9qn\" (UniqueName: \"kubernetes.io/projected/c30dadf5-37b7-47b9-883f-556482c1e8d0-kube-api-access-4k9qn\") pod \"c30dadf5-37b7-47b9-883f-556482c1e8d0\" (UID: \"c30dadf5-37b7-47b9-883f-556482c1e8d0\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.170960 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt7m4\" (UniqueName: \"kubernetes.io/projected/5724b55b-9e2b-481e-8850-95a521a27999-kube-api-access-dt7m4\") pod \"5724b55b-9e2b-481e-8850-95a521a27999\" (UID: \"5724b55b-9e2b-481e-8850-95a521a27999\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.171011 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-operator-scripts\") pod \"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\" (UID: \"4de11318-8cb2-44c8-ab01-41be6b3cd1c8\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.171056 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c30dadf5-37b7-47b9-883f-556482c1e8d0-operator-scripts\") pod \"c30dadf5-37b7-47b9-883f-556482c1e8d0\" (UID: \"c30dadf5-37b7-47b9-883f-556482c1e8d0\") " Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.171964 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mlf4\" (UniqueName: \"kubernetes.io/projected/e6fecad8-69a3-4252-b812-d1d6f0422691-kube-api-access-9mlf4\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.171988 4546 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6fecad8-69a3-4252-b812-d1d6f0422691-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.172462 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30dadf5-37b7-47b9-883f-556482c1e8d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c30dadf5-37b7-47b9-883f-556482c1e8d0" (UID: "c30dadf5-37b7-47b9-883f-556482c1e8d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.176348 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5724b55b-9e2b-481e-8850-95a521a27999-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5724b55b-9e2b-481e-8850-95a521a27999" (UID: "5724b55b-9e2b-481e-8850-95a521a27999"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.177763 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31ff518a-27b9-4b86-a833-9e45e65104e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31ff518a-27b9-4b86-a833-9e45e65104e2" (UID: "31ff518a-27b9-4b86-a833-9e45e65104e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.178277 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4de11318-8cb2-44c8-ab01-41be6b3cd1c8" (UID: "4de11318-8cb2-44c8-ab01-41be6b3cd1c8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.182240 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ff518a-27b9-4b86-a833-9e45e65104e2-kube-api-access-spzh6" (OuterVolumeSpecName: "kube-api-access-spzh6") pod "31ff518a-27b9-4b86-a833-9e45e65104e2" (UID: "31ff518a-27b9-4b86-a833-9e45e65104e2"). InnerVolumeSpecName "kube-api-access-spzh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.182932 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5724b55b-9e2b-481e-8850-95a521a27999-kube-api-access-dt7m4" (OuterVolumeSpecName: "kube-api-access-dt7m4") pod "5724b55b-9e2b-481e-8850-95a521a27999" (UID: "5724b55b-9e2b-481e-8850-95a521a27999"). InnerVolumeSpecName "kube-api-access-dt7m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.183955 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-kube-api-access-tpnjl" (OuterVolumeSpecName: "kube-api-access-tpnjl") pod "4de11318-8cb2-44c8-ab01-41be6b3cd1c8" (UID: "4de11318-8cb2-44c8-ab01-41be6b3cd1c8"). InnerVolumeSpecName "kube-api-access-tpnjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.186630 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30dadf5-37b7-47b9-883f-556482c1e8d0-kube-api-access-4k9qn" (OuterVolumeSpecName: "kube-api-access-4k9qn") pod "c30dadf5-37b7-47b9-883f-556482c1e8d0" (UID: "c30dadf5-37b7-47b9-883f-556482c1e8d0"). InnerVolumeSpecName "kube-api-access-4k9qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.274278 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c30dadf5-37b7-47b9-883f-556482c1e8d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.274698 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpnjl\" (UniqueName: \"kubernetes.io/projected/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-kube-api-access-tpnjl\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.274711 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5724b55b-9e2b-481e-8850-95a521a27999-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.274724 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spzh6\" (UniqueName: \"kubernetes.io/projected/31ff518a-27b9-4b86-a833-9e45e65104e2-kube-api-access-spzh6\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.274736 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31ff518a-27b9-4b86-a833-9e45e65104e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.274748 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k9qn\" (UniqueName: \"kubernetes.io/projected/c30dadf5-37b7-47b9-883f-556482c1e8d0-kube-api-access-4k9qn\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.274761 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt7m4\" (UniqueName: \"kubernetes.io/projected/5724b55b-9e2b-481e-8850-95a521a27999-kube-api-access-dt7m4\") on node \"crc\" DevicePath \"\"" 
Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.274773 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de11318-8cb2-44c8-ab01-41be6b3cd1c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.995445 4546 generic.go:334] "Generic (PLEG): container finished" podID="e916bb26-02e3-4748-b195-0bb5bf550f71" containerID="e70b3326ebad4e85b3cf2c4dd2a9a14a86c06473d000a357f1437d660a9f8322" exitCode=0 Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.996004 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7ee6-account-create-update-pn854" event={"ID":"e916bb26-02e3-4748-b195-0bb5bf550f71","Type":"ContainerDied","Data":"e70b3326ebad4e85b3cf2c4dd2a9a14a86c06473d000a357f1437d660a9f8322"} Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.997759 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mhf5x" event={"ID":"38bd4178-697d-4013-a31e-573c439c9517","Type":"ContainerStarted","Data":"24183fa9a4cdd850181295985ab038d234011899b428e94dbfb61e117172d958"} Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.999062 4546 generic.go:334] "Generic (PLEG): container finished" podID="486e299d-c690-4f46-8197-e531c53a8b11" containerID="e9b6c6995fb811aedaa272c2f9041e4e526fb9e4824350e565a424423c687e95" exitCode=0 Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.999156 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rsx9q" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.999313 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3734-account-create-update-h67dn" Feb 01 06:56:04 crc kubenswrapper[4546]: I0201 06:56:04.999789 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hs7bh" event={"ID":"486e299d-c690-4f46-8197-e531c53a8b11","Type":"ContainerDied","Data":"e9b6c6995fb811aedaa272c2f9041e4e526fb9e4824350e565a424423c687e95"} Feb 01 06:56:05 crc kubenswrapper[4546]: I0201 06:56:05.000025 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-25ztg" Feb 01 06:56:05 crc kubenswrapper[4546]: I0201 06:56:05.010502 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8d7b7" Feb 01 06:56:05 crc kubenswrapper[4546]: I0201 06:56:05.012212 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8fd9-account-create-update-cgcz9" Feb 01 06:56:05 crc kubenswrapper[4546]: I0201 06:56:05.074191 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mhf5x" podStartSLOduration=2.063503233 podStartE2EDuration="7.074166025s" podCreationTimestamp="2026-02-01 06:55:58 +0000 UTC" firstStartedPulling="2026-02-01 06:55:58.909479812 +0000 UTC m=+789.560415828" lastFinishedPulling="2026-02-01 06:56:03.920142605 +0000 UTC m=+794.571078620" observedRunningTime="2026-02-01 06:56:05.060392341 +0000 UTC m=+795.711328356" watchObservedRunningTime="2026-02-01 06:56:05.074166025 +0000 UTC m=+795.725102041" Feb 01 06:56:05 crc kubenswrapper[4546]: I0201 06:56:05.612089 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:56:05 crc kubenswrapper[4546]: I0201 06:56:05.686547 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb7fd957f-mlbgp"] Feb 01 06:56:05 crc kubenswrapper[4546]: I0201 
06:56:05.686733 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" podUID="ffe0876a-46be-41a8-8ca5-d99e28e349a6" containerName="dnsmasq-dns" containerID="cri-o://0596565a16df24567f2038eb908a208c5f1b2de43012231a2ed636c16f5b8eae" gracePeriod=10 Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.015455 4546 generic.go:334] "Generic (PLEG): container finished" podID="ffe0876a-46be-41a8-8ca5-d99e28e349a6" containerID="0596565a16df24567f2038eb908a208c5f1b2de43012231a2ed636c16f5b8eae" exitCode=0 Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.015525 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" event={"ID":"ffe0876a-46be-41a8-8ca5-d99e28e349a6","Type":"ContainerDied","Data":"0596565a16df24567f2038eb908a208c5f1b2de43012231a2ed636c16f5b8eae"} Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.170526 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.220328 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6wkp\" (UniqueName: \"kubernetes.io/projected/ffe0876a-46be-41a8-8ca5-d99e28e349a6-kube-api-access-k6wkp\") pod \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.220380 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-dns-svc\") pod \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.220530 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-config\") pod \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\" (UID: \"ffe0876a-46be-41a8-8ca5-d99e28e349a6\") " Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.255992 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe0876a-46be-41a8-8ca5-d99e28e349a6-kube-api-access-k6wkp" (OuterVolumeSpecName: "kube-api-access-k6wkp") pod "ffe0876a-46be-41a8-8ca5-d99e28e349a6" (UID: "ffe0876a-46be-41a8-8ca5-d99e28e349a6"). InnerVolumeSpecName "kube-api-access-k6wkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.269702 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-config" (OuterVolumeSpecName: "config") pod "ffe0876a-46be-41a8-8ca5-d99e28e349a6" (UID: "ffe0876a-46be-41a8-8ca5-d99e28e349a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.305469 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ffe0876a-46be-41a8-8ca5-d99e28e349a6" (UID: "ffe0876a-46be-41a8-8ca5-d99e28e349a6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.325124 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6wkp\" (UniqueName: \"kubernetes.io/projected/ffe0876a-46be-41a8-8ca5-d99e28e349a6-kube-api-access-k6wkp\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.325165 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.325177 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe0876a-46be-41a8-8ca5-d99e28e349a6-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.352441 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.358555 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.425952 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5j59\" (UniqueName: \"kubernetes.io/projected/486e299d-c690-4f46-8197-e531c53a8b11-kube-api-access-l5j59\") pod \"486e299d-c690-4f46-8197-e531c53a8b11\" (UID: \"486e299d-c690-4f46-8197-e531c53a8b11\") " Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.426001 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2tcp\" (UniqueName: \"kubernetes.io/projected/e916bb26-02e3-4748-b195-0bb5bf550f71-kube-api-access-x2tcp\") pod \"e916bb26-02e3-4748-b195-0bb5bf550f71\" (UID: \"e916bb26-02e3-4748-b195-0bb5bf550f71\") " Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.426846 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e916bb26-02e3-4748-b195-0bb5bf550f71-operator-scripts\") pod \"e916bb26-02e3-4748-b195-0bb5bf550f71\" (UID: \"e916bb26-02e3-4748-b195-0bb5bf550f71\") " Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.426893 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486e299d-c690-4f46-8197-e531c53a8b11-operator-scripts\") pod \"486e299d-c690-4f46-8197-e531c53a8b11\" (UID: \"486e299d-c690-4f46-8197-e531c53a8b11\") " Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.427404 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e916bb26-02e3-4748-b195-0bb5bf550f71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e916bb26-02e3-4748-b195-0bb5bf550f71" (UID: "e916bb26-02e3-4748-b195-0bb5bf550f71"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.427818 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/486e299d-c690-4f46-8197-e531c53a8b11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "486e299d-c690-4f46-8197-e531c53a8b11" (UID: "486e299d-c690-4f46-8197-e531c53a8b11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.429594 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e916bb26-02e3-4748-b195-0bb5bf550f71-kube-api-access-x2tcp" (OuterVolumeSpecName: "kube-api-access-x2tcp") pod "e916bb26-02e3-4748-b195-0bb5bf550f71" (UID: "e916bb26-02e3-4748-b195-0bb5bf550f71"). InnerVolumeSpecName "kube-api-access-x2tcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.430135 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486e299d-c690-4f46-8197-e531c53a8b11-kube-api-access-l5j59" (OuterVolumeSpecName: "kube-api-access-l5j59") pod "486e299d-c690-4f46-8197-e531c53a8b11" (UID: "486e299d-c690-4f46-8197-e531c53a8b11"). InnerVolumeSpecName "kube-api-access-l5j59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.528537 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e916bb26-02e3-4748-b195-0bb5bf550f71-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.528576 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/486e299d-c690-4f46-8197-e531c53a8b11-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.528588 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5j59\" (UniqueName: \"kubernetes.io/projected/486e299d-c690-4f46-8197-e531c53a8b11-kube-api-access-l5j59\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:06 crc kubenswrapper[4546]: I0201 06:56:06.528600 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2tcp\" (UniqueName: \"kubernetes.io/projected/e916bb26-02e3-4748-b195-0bb5bf550f71-kube-api-access-x2tcp\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.026410 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hs7bh" event={"ID":"486e299d-c690-4f46-8197-e531c53a8b11","Type":"ContainerDied","Data":"dcd085dbef1a5ab1f156ccb542dc719074dab5ef435c278cc5d5150d07834101"} Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.027110 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd085dbef1a5ab1f156ccb542dc719074dab5ef435c278cc5d5150d07834101" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.026438 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hs7bh" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.028524 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7ee6-account-create-update-pn854" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.028577 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7ee6-account-create-update-pn854" event={"ID":"e916bb26-02e3-4748-b195-0bb5bf550f71","Type":"ContainerDied","Data":"08752b0578b15cf33b7098c2e33845a1399febbe568c0fbe7441c8aa969d3e26"} Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.028691 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08752b0578b15cf33b7098c2e33845a1399febbe568c0fbe7441c8aa969d3e26" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.030737 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" event={"ID":"ffe0876a-46be-41a8-8ca5-d99e28e349a6","Type":"ContainerDied","Data":"f897537a8c8297c1eab5e212adc08d0161f7c2121a46f178290b12d8b572e427"} Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.030791 4546 scope.go:117] "RemoveContainer" containerID="0596565a16df24567f2038eb908a208c5f1b2de43012231a2ed636c16f5b8eae" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.030832 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb7fd957f-mlbgp" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.049599 4546 scope.go:117] "RemoveContainer" containerID="e6e8515ed52b771380fb00ddbbaadbcf2dbd128e1b9468f0ba1291f572892701" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.086003 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb7fd957f-mlbgp"] Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.098457 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb7fd957f-mlbgp"] Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.634472 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-25ztg"] Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.668838 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe0876a-46be-41a8-8ca5-d99e28e349a6" path="/var/lib/kubelet/pods/ffe0876a-46be-41a8-8ca5-d99e28e349a6/volumes" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.674564 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-25ztg"] Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.739780 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qx4ns"] Feb 01 06:56:07 crc kubenswrapper[4546]: E0201 06:56:07.740256 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486e299d-c690-4f46-8197-e531c53a8b11" containerName="mariadb-database-create" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740273 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="486e299d-c690-4f46-8197-e531c53a8b11" containerName="mariadb-database-create" Feb 01 06:56:07 crc kubenswrapper[4546]: E0201 06:56:07.740300 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ff518a-27b9-4b86-a833-9e45e65104e2" containerName="mariadb-database-create" Feb 01 06:56:07 crc kubenswrapper[4546]: 
I0201 06:56:07.740306 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ff518a-27b9-4b86-a833-9e45e65104e2" containerName="mariadb-database-create" Feb 01 06:56:07 crc kubenswrapper[4546]: E0201 06:56:07.740331 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fecad8-69a3-4252-b812-d1d6f0422691" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740336 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fecad8-69a3-4252-b812-d1d6f0422691" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: E0201 06:56:07.740343 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe0876a-46be-41a8-8ca5-d99e28e349a6" containerName="dnsmasq-dns" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740349 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe0876a-46be-41a8-8ca5-d99e28e349a6" containerName="dnsmasq-dns" Feb 01 06:56:07 crc kubenswrapper[4546]: E0201 06:56:07.740361 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e916bb26-02e3-4748-b195-0bb5bf550f71" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740367 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e916bb26-02e3-4748-b195-0bb5bf550f71" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: E0201 06:56:07.740378 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de11318-8cb2-44c8-ab01-41be6b3cd1c8" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740384 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de11318-8cb2-44c8-ab01-41be6b3cd1c8" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: E0201 06:56:07.740396 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5724b55b-9e2b-481e-8850-95a521a27999" 
containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740401 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="5724b55b-9e2b-481e-8850-95a521a27999" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: E0201 06:56:07.740413 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30dadf5-37b7-47b9-883f-556482c1e8d0" containerName="mariadb-database-create" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740428 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30dadf5-37b7-47b9-883f-556482c1e8d0" containerName="mariadb-database-create" Feb 01 06:56:07 crc kubenswrapper[4546]: E0201 06:56:07.740437 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe0876a-46be-41a8-8ca5-d99e28e349a6" containerName="init" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740442 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe0876a-46be-41a8-8ca5-d99e28e349a6" containerName="init" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740602 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30dadf5-37b7-47b9-883f-556482c1e8d0" containerName="mariadb-database-create" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740616 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="31ff518a-27b9-4b86-a833-9e45e65104e2" containerName="mariadb-database-create" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740625 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e916bb26-02e3-4748-b195-0bb5bf550f71" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740634 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de11318-8cb2-44c8-ab01-41be6b3cd1c8" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740639 4546 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="486e299d-c690-4f46-8197-e531c53a8b11" containerName="mariadb-database-create" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740646 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe0876a-46be-41a8-8ca5-d99e28e349a6" containerName="dnsmasq-dns" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740654 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="5724b55b-9e2b-481e-8850-95a521a27999" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.740663 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fecad8-69a3-4252-b812-d1d6f0422691" containerName="mariadb-account-create-update" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.741344 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.743295 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.749245 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qx4ns"] Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.756717 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cgvj\" (UniqueName: \"kubernetes.io/projected/5f0debea-5891-4ea6-8109-b55288b0d265-kube-api-access-9cgvj\") pod \"root-account-create-update-qx4ns\" (UID: \"5f0debea-5891-4ea6-8109-b55288b0d265\") " pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.756869 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0debea-5891-4ea6-8109-b55288b0d265-operator-scripts\") pod 
\"root-account-create-update-qx4ns\" (UID: \"5f0debea-5891-4ea6-8109-b55288b0d265\") " pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.858956 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0debea-5891-4ea6-8109-b55288b0d265-operator-scripts\") pod \"root-account-create-update-qx4ns\" (UID: \"5f0debea-5891-4ea6-8109-b55288b0d265\") " pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.859039 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cgvj\" (UniqueName: \"kubernetes.io/projected/5f0debea-5891-4ea6-8109-b55288b0d265-kube-api-access-9cgvj\") pod \"root-account-create-update-qx4ns\" (UID: \"5f0debea-5891-4ea6-8109-b55288b0d265\") " pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.860099 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0debea-5891-4ea6-8109-b55288b0d265-operator-scripts\") pod \"root-account-create-update-qx4ns\" (UID: \"5f0debea-5891-4ea6-8109-b55288b0d265\") " pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:07 crc kubenswrapper[4546]: I0201 06:56:07.880765 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cgvj\" (UniqueName: \"kubernetes.io/projected/5f0debea-5891-4ea6-8109-b55288b0d265-kube-api-access-9cgvj\") pod \"root-account-create-update-qx4ns\" (UID: \"5f0debea-5891-4ea6-8109-b55288b0d265\") " pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:08 crc kubenswrapper[4546]: I0201 06:56:08.056992 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:08 crc kubenswrapper[4546]: I0201 06:56:08.477841 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qx4ns"] Feb 01 06:56:08 crc kubenswrapper[4546]: W0201 06:56:08.496982 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f0debea_5891_4ea6_8109_b55288b0d265.slice/crio-a81be165bae0a7b88fde6610587f70232bae56447b826c35bb71fbd167081ab0 WatchSource:0}: Error finding container a81be165bae0a7b88fde6610587f70232bae56447b826c35bb71fbd167081ab0: Status 404 returned error can't find the container with id a81be165bae0a7b88fde6610587f70232bae56447b826c35bb71fbd167081ab0 Feb 01 06:56:09 crc kubenswrapper[4546]: I0201 06:56:09.054589 4546 generic.go:334] "Generic (PLEG): container finished" podID="5f0debea-5891-4ea6-8109-b55288b0d265" containerID="dee064dc14987663f756eb4931182ab2deefc780c8cfee58b0b3ca9c9b214e18" exitCode=0 Feb 01 06:56:09 crc kubenswrapper[4546]: I0201 06:56:09.054897 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qx4ns" event={"ID":"5f0debea-5891-4ea6-8109-b55288b0d265","Type":"ContainerDied","Data":"dee064dc14987663f756eb4931182ab2deefc780c8cfee58b0b3ca9c9b214e18"} Feb 01 06:56:09 crc kubenswrapper[4546]: I0201 06:56:09.054926 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qx4ns" event={"ID":"5f0debea-5891-4ea6-8109-b55288b0d265","Type":"ContainerStarted","Data":"a81be165bae0a7b88fde6610587f70232bae56447b826c35bb71fbd167081ab0"} Feb 01 06:56:09 crc kubenswrapper[4546]: I0201 06:56:09.667569 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fecad8-69a3-4252-b812-d1d6f0422691" path="/var/lib/kubelet/pods/e6fecad8-69a3-4252-b812-d1d6f0422691/volumes" Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.065143 4546 
generic.go:334] "Generic (PLEG): container finished" podID="38bd4178-697d-4013-a31e-573c439c9517" containerID="24183fa9a4cdd850181295985ab038d234011899b428e94dbfb61e117172d958" exitCode=0 Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.065232 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mhf5x" event={"ID":"38bd4178-697d-4013-a31e-573c439c9517","Type":"ContainerDied","Data":"24183fa9a4cdd850181295985ab038d234011899b428e94dbfb61e117172d958"} Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.104997 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.127353 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00f597d1-7dec-4229-9b1c-eebfb6958694-etc-swift\") pod \"swift-storage-0\" (UID: \"00f597d1-7dec-4229-9b1c-eebfb6958694\") " pod="openstack/swift-storage-0" Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.152361 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.377416 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.411717 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cgvj\" (UniqueName: \"kubernetes.io/projected/5f0debea-5891-4ea6-8109-b55288b0d265-kube-api-access-9cgvj\") pod \"5f0debea-5891-4ea6-8109-b55288b0d265\" (UID: \"5f0debea-5891-4ea6-8109-b55288b0d265\") " Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.411770 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0debea-5891-4ea6-8109-b55288b0d265-operator-scripts\") pod \"5f0debea-5891-4ea6-8109-b55288b0d265\" (UID: \"5f0debea-5891-4ea6-8109-b55288b0d265\") " Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.413200 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0debea-5891-4ea6-8109-b55288b0d265-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f0debea-5891-4ea6-8109-b55288b0d265" (UID: "5f0debea-5891-4ea6-8109-b55288b0d265"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.419097 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0debea-5891-4ea6-8109-b55288b0d265-kube-api-access-9cgvj" (OuterVolumeSpecName: "kube-api-access-9cgvj") pod "5f0debea-5891-4ea6-8109-b55288b0d265" (UID: "5f0debea-5891-4ea6-8109-b55288b0d265"). InnerVolumeSpecName "kube-api-access-9cgvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.514337 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cgvj\" (UniqueName: \"kubernetes.io/projected/5f0debea-5891-4ea6-8109-b55288b0d265-kube-api-access-9cgvj\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.514374 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0debea-5891-4ea6-8109-b55288b0d265-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:10 crc kubenswrapper[4546]: I0201 06:56:10.661986 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.075640 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"a25f3ac8f15ff481ffcb61a7e557952e7804f517878afd7167f72f2e5c252b77"} Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.078634 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qx4ns" event={"ID":"5f0debea-5891-4ea6-8109-b55288b0d265","Type":"ContainerDied","Data":"a81be165bae0a7b88fde6610587f70232bae56447b826c35bb71fbd167081ab0"} Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.078772 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81be165bae0a7b88fde6610587f70232bae56447b826c35bb71fbd167081ab0" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.078687 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qx4ns" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.216310 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-sscjj"] Feb 01 06:56:11 crc kubenswrapper[4546]: E0201 06:56:11.217816 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0debea-5891-4ea6-8109-b55288b0d265" containerName="mariadb-account-create-update" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.217876 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0debea-5891-4ea6-8109-b55288b0d265" containerName="mariadb-account-create-update" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.218355 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0debea-5891-4ea6-8109-b55288b0d265" containerName="mariadb-account-create-update" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.219499 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.223285 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.228263 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sscjj"] Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.228569 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ht5t7" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.332199 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-db-sync-config-data\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.332386 
4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-config-data\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.332438 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-combined-ca-bundle\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.332629 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdcnd\" (UniqueName: \"kubernetes.io/projected/2bf01534-1b7d-4f23-bc2c-02cb329a2036-kube-api-access-zdcnd\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.412441 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.434029 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-config-data\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.434073 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-combined-ca-bundle\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.434139 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdcnd\" (UniqueName: \"kubernetes.io/projected/2bf01534-1b7d-4f23-bc2c-02cb329a2036-kube-api-access-zdcnd\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.434178 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-db-sync-config-data\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.450042 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-db-sync-config-data\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 
06:56:11.452092 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-config-data\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.463296 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-combined-ca-bundle\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.473288 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdcnd\" (UniqueName: \"kubernetes.io/projected/2bf01534-1b7d-4f23-bc2c-02cb329a2036-kube-api-access-zdcnd\") pod \"glance-db-sync-sscjj\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.535128 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-swiftconf\") pod \"38bd4178-697d-4013-a31e-573c439c9517\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.535359 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-scripts\") pod \"38bd4178-697d-4013-a31e-573c439c9517\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.535543 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-ring-data-devices\") pod \"38bd4178-697d-4013-a31e-573c439c9517\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.535633 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-dispersionconf\") pod \"38bd4178-697d-4013-a31e-573c439c9517\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.535728 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38bd4178-697d-4013-a31e-573c439c9517-etc-swift\") pod \"38bd4178-697d-4013-a31e-573c439c9517\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.535891 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-combined-ca-bundle\") pod \"38bd4178-697d-4013-a31e-573c439c9517\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.535962 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8p5z\" (UniqueName: \"kubernetes.io/projected/38bd4178-697d-4013-a31e-573c439c9517-kube-api-access-x8p5z\") pod \"38bd4178-697d-4013-a31e-573c439c9517\" (UID: \"38bd4178-697d-4013-a31e-573c439c9517\") " Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.539431 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38bd4178-697d-4013-a31e-573c439c9517-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "38bd4178-697d-4013-a31e-573c439c9517" (UID: "38bd4178-697d-4013-a31e-573c439c9517"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.540372 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "38bd4178-697d-4013-a31e-573c439c9517" (UID: "38bd4178-697d-4013-a31e-573c439c9517"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.551164 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bd4178-697d-4013-a31e-573c439c9517-kube-api-access-x8p5z" (OuterVolumeSpecName: "kube-api-access-x8p5z") pod "38bd4178-697d-4013-a31e-573c439c9517" (UID: "38bd4178-697d-4013-a31e-573c439c9517"). InnerVolumeSpecName "kube-api-access-x8p5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.557930 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sscjj" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.578126 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "38bd4178-697d-4013-a31e-573c439c9517" (UID: "38bd4178-697d-4013-a31e-573c439c9517"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.578467 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-scripts" (OuterVolumeSpecName: "scripts") pod "38bd4178-697d-4013-a31e-573c439c9517" (UID: "38bd4178-697d-4013-a31e-573c439c9517"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.584076 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38bd4178-697d-4013-a31e-573c439c9517" (UID: "38bd4178-697d-4013-a31e-573c439c9517"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.586003 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "38bd4178-697d-4013-a31e-573c439c9517" (UID: "38bd4178-697d-4013-a31e-573c439c9517"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.638986 4546 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.639017 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.639029 4546 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.639039 4546 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38bd4178-697d-4013-a31e-573c439c9517-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:11 crc 
kubenswrapper[4546]: I0201 06:56:11.639049 4546 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38bd4178-697d-4013-a31e-573c439c9517-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.639060 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bd4178-697d-4013-a31e-573c439c9517-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:11 crc kubenswrapper[4546]: I0201 06:56:11.639068 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8p5z\" (UniqueName: \"kubernetes.io/projected/38bd4178-697d-4013-a31e-573c439c9517-kube-api-access-x8p5z\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:12 crc kubenswrapper[4546]: I0201 06:56:12.093890 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sscjj"] Feb 01 06:56:12 crc kubenswrapper[4546]: I0201 06:56:12.097305 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mhf5x" event={"ID":"38bd4178-697d-4013-a31e-573c439c9517","Type":"ContainerDied","Data":"38644f846f81357a421773572c0eab925d85a67b559e506b4531c9daf3b078dc"} Feb 01 06:56:12 crc kubenswrapper[4546]: I0201 06:56:12.097352 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38644f846f81357a421773572c0eab925d85a67b559e506b4531c9daf3b078dc" Feb 01 06:56:12 crc kubenswrapper[4546]: I0201 06:56:12.097418 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mhf5x" Feb 01 06:56:12 crc kubenswrapper[4546]: W0201 06:56:12.171615 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bf01534_1b7d_4f23_bc2c_02cb329a2036.slice/crio-3c4a41b5b057593054cc3acb53df17ed18f2da8fa2875387a8a6de945dc37377 WatchSource:0}: Error finding container 3c4a41b5b057593054cc3acb53df17ed18f2da8fa2875387a8a6de945dc37377: Status 404 returned error can't find the container with id 3c4a41b5b057593054cc3acb53df17ed18f2da8fa2875387a8a6de945dc37377 Feb 01 06:56:13 crc kubenswrapper[4546]: I0201 06:56:13.111677 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"84bed7b3a38d7e4d9b9754c1a385702965df2c56d2ace20335e4695cfe15301c"} Feb 01 06:56:13 crc kubenswrapper[4546]: I0201 06:56:13.113051 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"0d4be5aad025203e3c087b17088372675ccbb0d5851af99e58baf00dc2896904"} Feb 01 06:56:13 crc kubenswrapper[4546]: I0201 06:56:13.113151 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"2d1db288b579b4ed8de3c77970f7232f2553ca1a511805c9bf5d434a6423a22c"} Feb 01 06:56:13 crc kubenswrapper[4546]: I0201 06:56:13.113214 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"b6f0a8e8cddfc72ef03e1e733170973debb251e1d0c617eec92a3086e28fcb6f"} Feb 01 06:56:13 crc kubenswrapper[4546]: I0201 06:56:13.114347 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sscjj" 
event={"ID":"2bf01534-1b7d-4f23-bc2c-02cb329a2036","Type":"ContainerStarted","Data":"3c4a41b5b057593054cc3acb53df17ed18f2da8fa2875387a8a6de945dc37377"} Feb 01 06:56:14 crc kubenswrapper[4546]: I0201 06:56:14.079253 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qx4ns"] Feb 01 06:56:14 crc kubenswrapper[4546]: I0201 06:56:14.086357 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qx4ns"] Feb 01 06:56:15 crc kubenswrapper[4546]: I0201 06:56:15.664291 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0debea-5891-4ea6-8109-b55288b0d265" path="/var/lib/kubelet/pods/5f0debea-5891-4ea6-8109-b55288b0d265/volumes" Feb 01 06:56:15 crc kubenswrapper[4546]: I0201 06:56:15.813889 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 01 06:56:16 crc kubenswrapper[4546]: I0201 06:56:16.150704 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"eb7b8bed3365dd5c191abdf6506c61f69371ca8dfe29a53400bb7341adf96d23"} Feb 01 06:56:16 crc kubenswrapper[4546]: I0201 06:56:16.150789 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"225858d0cf75b0353856f07718a787bede14c748baa2b7358da9fcac79207f7b"} Feb 01 06:56:16 crc kubenswrapper[4546]: I0201 06:56:16.150821 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"446873a921ea44295f605c75373df35ca2b6dca43fb95b6899ad0eca5c70bf40"} Feb 01 06:56:16 crc kubenswrapper[4546]: I0201 06:56:16.150836 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"1e2eb21f63a5d952007686fe80f8ced34f16b3b7f05dea147801a9f0f17f3b3f"} Feb 01 06:56:18 crc kubenswrapper[4546]: I0201 06:56:18.171378 4546 generic.go:334] "Generic (PLEG): container finished" podID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" containerID="9ec81dd258fc5363154282f0f86b3edb322ae34700105e5e89c739bb777690b0" exitCode=0 Feb 01 06:56:18 crc kubenswrapper[4546]: I0201 06:56:18.171486 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973","Type":"ContainerDied","Data":"9ec81dd258fc5363154282f0f86b3edb322ae34700105e5e89c739bb777690b0"} Feb 01 06:56:18 crc kubenswrapper[4546]: I0201 06:56:18.176030 4546 generic.go:334] "Generic (PLEG): container finished" podID="f9259854-6c00-413e-9061-399c808d9360" containerID="de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9" exitCode=0 Feb 01 06:56:18 crc kubenswrapper[4546]: I0201 06:56:18.176101 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f9259854-6c00-413e-9061-399c808d9360","Type":"ContainerDied","Data":"de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9"} Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.114920 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rt9zh"] Feb 01 06:56:19 crc kubenswrapper[4546]: E0201 06:56:19.115919 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bd4178-697d-4013-a31e-573c439c9517" containerName="swift-ring-rebalance" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.115950 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bd4178-697d-4013-a31e-573c439c9517" containerName="swift-ring-rebalance" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.116167 4546 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="38bd4178-697d-4013-a31e-573c439c9517" containerName="swift-ring-rebalance" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.117101 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rt9zh" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.123282 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.138926 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rt9zh"] Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.194878 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f9259854-6c00-413e-9061-399c808d9360","Type":"ContainerStarted","Data":"7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5"} Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.196075 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.202497 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973","Type":"ContainerStarted","Data":"618ce162921f9b2a8ee7ddb9d0c2ca5cb307fca3dff89e03352d2a264ff4e972"} Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.202717 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.236688 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.08963533 podStartE2EDuration="1m3.236673674s" podCreationTimestamp="2026-02-01 06:55:16 +0000 UTC" firstStartedPulling="2026-02-01 06:55:18.332149324 +0000 UTC m=+748.983085340" lastFinishedPulling="2026-02-01 06:55:43.479187668 
+0000 UTC m=+774.130123684" observedRunningTime="2026-02-01 06:56:19.227743982 +0000 UTC m=+809.878679998" watchObservedRunningTime="2026-02-01 06:56:19.236673674 +0000 UTC m=+809.887609690" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.255239 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.149232724 podStartE2EDuration="1m3.255219945s" podCreationTimestamp="2026-02-01 06:55:16 +0000 UTC" firstStartedPulling="2026-02-01 06:55:18.373137127 +0000 UTC m=+749.024073133" lastFinishedPulling="2026-02-01 06:55:43.479124338 +0000 UTC m=+774.130060354" observedRunningTime="2026-02-01 06:56:19.24684078 +0000 UTC m=+809.897776796" watchObservedRunningTime="2026-02-01 06:56:19.255219945 +0000 UTC m=+809.906155961" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.294800 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/715ff32b-ef52-42e4-8f3c-7e88c6612a20-operator-scripts\") pod \"root-account-create-update-rt9zh\" (UID: \"715ff32b-ef52-42e4-8f3c-7e88c6612a20\") " pod="openstack/root-account-create-update-rt9zh" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.295369 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t674p\" (UniqueName: \"kubernetes.io/projected/715ff32b-ef52-42e4-8f3c-7e88c6612a20-kube-api-access-t674p\") pod \"root-account-create-update-rt9zh\" (UID: \"715ff32b-ef52-42e4-8f3c-7e88c6612a20\") " pod="openstack/root-account-create-update-rt9zh" Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.397313 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t674p\" (UniqueName: \"kubernetes.io/projected/715ff32b-ef52-42e4-8f3c-7e88c6612a20-kube-api-access-t674p\") pod \"root-account-create-update-rt9zh\" (UID: 
\"715ff32b-ef52-42e4-8f3c-7e88c6612a20\") " pod="openstack/root-account-create-update-rt9zh"
Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.397397 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/715ff32b-ef52-42e4-8f3c-7e88c6612a20-operator-scripts\") pod \"root-account-create-update-rt9zh\" (UID: \"715ff32b-ef52-42e4-8f3c-7e88c6612a20\") " pod="openstack/root-account-create-update-rt9zh"
Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.398157 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/715ff32b-ef52-42e4-8f3c-7e88c6612a20-operator-scripts\") pod \"root-account-create-update-rt9zh\" (UID: \"715ff32b-ef52-42e4-8f3c-7e88c6612a20\") " pod="openstack/root-account-create-update-rt9zh"
Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.417564 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t674p\" (UniqueName: \"kubernetes.io/projected/715ff32b-ef52-42e4-8f3c-7e88c6612a20-kube-api-access-t674p\") pod \"root-account-create-update-rt9zh\" (UID: \"715ff32b-ef52-42e4-8f3c-7e88c6612a20\") " pod="openstack/root-account-create-update-rt9zh"
Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.441945 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rt9zh"
Feb 01 06:56:19 crc kubenswrapper[4546]: I0201 06:56:19.958799 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rt9zh"]
Feb 01 06:56:19 crc kubenswrapper[4546]: W0201 06:56:19.967178 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod715ff32b_ef52_42e4_8f3c_7e88c6612a20.slice/crio-3ea5f85181fd19a2c40f05c043f35a2db2e30f7eef7cd9e4bf79591787bacd65 WatchSource:0}: Error finding container 3ea5f85181fd19a2c40f05c043f35a2db2e30f7eef7cd9e4bf79591787bacd65: Status 404 returned error can't find the container with id 3ea5f85181fd19a2c40f05c043f35a2db2e30f7eef7cd9e4bf79591787bacd65
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.215022 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rt9zh" event={"ID":"715ff32b-ef52-42e4-8f3c-7e88c6612a20","Type":"ContainerStarted","Data":"3fd182c0c1a7fc96ba75e18ef1b9ecac974ad33f103751f8002ae0932f083aa2"}
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.215072 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rt9zh" event={"ID":"715ff32b-ef52-42e4-8f3c-7e88c6612a20","Type":"ContainerStarted","Data":"3ea5f85181fd19a2c40f05c043f35a2db2e30f7eef7cd9e4bf79591787bacd65"}
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.243903 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"25bd18a9e8d6b9abfc1dcfdb96a34051273137b244182c0f41dd7da25d224035"}
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.243965 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"0a1e52b32f9d76550145113fa888ce0f6dac7012a9e6458c5d7cf0093aee99bd"}
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.243977 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"03c90dc02639d9e612415ca42c840d392c2ffd3b9aec6a85f60a90b2e2c31cde"}
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.243986 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"1f1c723a8df8dc3e0ab2571cfd8671dadf9795120902b9f6fb21372c0efe8f9c"}
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.243996 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"d65e6c0cc6d4f739ae0c9da1edf08f6bd1c56b6024f0663305d27ccb2aa35f76"}
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.244005 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"fc7b7195cf40966cd6c34e4ea9d5525e9237ad7bca87705f27efd27898ee14e2"}
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.248746 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-rt9zh" podStartSLOduration=1.248736595 podStartE2EDuration="1.248736595s" podCreationTimestamp="2026-02-01 06:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:56:20.245128102 +0000 UTC m=+810.896064118" watchObservedRunningTime="2026-02-01 06:56:20.248736595 +0000 UTC m=+810.899672611"
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.839158 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-fw66k" podUID="7f0f1cbe-76e7-455a-80da-05602295973b" containerName="ovn-controller" probeResult="failure" output=<
Feb 01 06:56:20 crc kubenswrapper[4546]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 01 06:56:20 crc kubenswrapper[4546]: >
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.868294 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4x5w6"
Feb 01 06:56:20 crc kubenswrapper[4546]: I0201 06:56:20.872097 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4x5w6"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.119149 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fw66k-config-5ntps"]
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.120982 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.128252 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.136391 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fw66k-config-5ntps"]
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.237949 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-log-ovn\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.238033 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run-ovn\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.238206 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-additional-scripts\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.238282 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf9r5\" (UniqueName: \"kubernetes.io/projected/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-kube-api-access-kf9r5\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.238306 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.238359 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-scripts\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.264293 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"00f597d1-7dec-4229-9b1c-eebfb6958694","Type":"ContainerStarted","Data":"31f01b540f4634129fa73f2d18d6f55a2f5563ac52d7bb7614c95b5200daea95"}
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.267093 4546 generic.go:334] "Generic (PLEG): container finished" podID="715ff32b-ef52-42e4-8f3c-7e88c6612a20" containerID="3fd182c0c1a7fc96ba75e18ef1b9ecac974ad33f103751f8002ae0932f083aa2" exitCode=0
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.267515 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rt9zh" event={"ID":"715ff32b-ef52-42e4-8f3c-7e88c6612a20","Type":"ContainerDied","Data":"3fd182c0c1a7fc96ba75e18ef1b9ecac974ad33f103751f8002ae0932f083aa2"}
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.344698 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run-ovn\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.344814 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-additional-scripts\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.344947 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf9r5\" (UniqueName: \"kubernetes.io/projected/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-kube-api-access-kf9r5\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.344972 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.345027 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-scripts\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.345090 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-log-ovn\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.345889 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run-ovn\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.348317 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-log-ovn\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.349778 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-additional-scripts\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.350833 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-scripts\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.351814 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.366537 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.964829895 podStartE2EDuration="28.36651432s" podCreationTimestamp="2026-02-01 06:55:53 +0000 UTC" firstStartedPulling="2026-02-01 06:56:10.671884571 +0000 UTC m=+801.322820577" lastFinishedPulling="2026-02-01 06:56:19.073568986 +0000 UTC m=+809.724505002" observedRunningTime="2026-02-01 06:56:21.315337751 +0000 UTC m=+811.966273757" watchObservedRunningTime="2026-02-01 06:56:21.36651432 +0000 UTC m=+812.017450336"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.369572 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf9r5\" (UniqueName: \"kubernetes.io/projected/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-kube-api-access-kf9r5\") pod \"ovn-controller-fw66k-config-5ntps\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") " pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.443573 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.633052 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9bb7bbd45-zmgsm"]
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.634573 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.649660 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.702784 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9bb7bbd45-zmgsm"]
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.755218 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fw66k-config-5ntps"]
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.755703 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-swift-storage-0\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.755799 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-config\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.755912 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-svc\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.755975 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-nb\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.756080 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-sb\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.756205 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6jln\" (UniqueName: \"kubernetes.io/projected/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-kube-api-access-b6jln\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.859061 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6jln\" (UniqueName: \"kubernetes.io/projected/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-kube-api-access-b6jln\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.859141 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-swift-storage-0\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.859251 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-config\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.859306 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-svc\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.859400 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-nb\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.859543 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-sb\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.861220 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-sb\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.862463 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-swift-storage-0\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.863776 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-config\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.864457 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-svc\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.865659 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-nb\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.880029 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6jln\" (UniqueName: \"kubernetes.io/projected/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-kube-api-access-b6jln\") pod \"dnsmasq-dns-9bb7bbd45-zmgsm\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:21 crc kubenswrapper[4546]: I0201 06:56:21.969774 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.282930 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fw66k-config-5ntps" event={"ID":"9cf29ca8-566b-4456-8aa6-b93bfaf703f6","Type":"ContainerStarted","Data":"91b9bce1ce39d7be6584b1e8919e1537f07fc027dc46be0fd6b79ceb47b68739"}
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.283315 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fw66k-config-5ntps" event={"ID":"9cf29ca8-566b-4456-8aa6-b93bfaf703f6","Type":"ContainerStarted","Data":"2cbe9bed8c580d25dfdbabf710193d596479794fdac4c7b554eb0dad01316472"}
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.327821 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-fw66k-config-5ntps" podStartSLOduration=1.327794342 podStartE2EDuration="1.327794342s" podCreationTimestamp="2026-02-01 06:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:56:22.319627467 +0000 UTC m=+812.970563484" watchObservedRunningTime="2026-02-01 06:56:22.327794342 +0000 UTC m=+812.978730359"
Feb 01 06:56:22 crc kubenswrapper[4546]: W0201 06:56:22.511578 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf952bfa_8c8c_4601_8ea9_f8ac259a7831.slice/crio-6eb42d0917f6452dde020caa6a6b220bacf886e453e65e0e24dcf64fb6b0aeba WatchSource:0}: Error finding container 6eb42d0917f6452dde020caa6a6b220bacf886e453e65e0e24dcf64fb6b0aeba: Status 404 returned error can't find the container with id 6eb42d0917f6452dde020caa6a6b220bacf886e453e65e0e24dcf64fb6b0aeba
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.524328 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9bb7bbd45-zmgsm"]
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.883974 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rt9zh"
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.982656 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t674p\" (UniqueName: \"kubernetes.io/projected/715ff32b-ef52-42e4-8f3c-7e88c6612a20-kube-api-access-t674p\") pod \"715ff32b-ef52-42e4-8f3c-7e88c6612a20\" (UID: \"715ff32b-ef52-42e4-8f3c-7e88c6612a20\") "
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.983554 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/715ff32b-ef52-42e4-8f3c-7e88c6612a20-operator-scripts\") pod \"715ff32b-ef52-42e4-8f3c-7e88c6612a20\" (UID: \"715ff32b-ef52-42e4-8f3c-7e88c6612a20\") "
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.984542 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/715ff32b-ef52-42e4-8f3c-7e88c6612a20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "715ff32b-ef52-42e4-8f3c-7e88c6612a20" (UID: "715ff32b-ef52-42e4-8f3c-7e88c6612a20"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.985104 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/715ff32b-ef52-42e4-8f3c-7e88c6612a20-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 06:56:22 crc kubenswrapper[4546]: I0201 06:56:22.990264 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715ff32b-ef52-42e4-8f3c-7e88c6612a20-kube-api-access-t674p" (OuterVolumeSpecName: "kube-api-access-t674p") pod "715ff32b-ef52-42e4-8f3c-7e88c6612a20" (UID: "715ff32b-ef52-42e4-8f3c-7e88c6612a20"). InnerVolumeSpecName "kube-api-access-t674p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:56:23 crc kubenswrapper[4546]: I0201 06:56:23.087522 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t674p\" (UniqueName: \"kubernetes.io/projected/715ff32b-ef52-42e4-8f3c-7e88c6612a20-kube-api-access-t674p\") on node \"crc\" DevicePath \"\""
Feb 01 06:56:23 crc kubenswrapper[4546]: I0201 06:56:23.311153 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rt9zh"
Feb 01 06:56:23 crc kubenswrapper[4546]: I0201 06:56:23.317134 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rt9zh" event={"ID":"715ff32b-ef52-42e4-8f3c-7e88c6612a20","Type":"ContainerDied","Data":"3ea5f85181fd19a2c40f05c043f35a2db2e30f7eef7cd9e4bf79591787bacd65"}
Feb 01 06:56:23 crc kubenswrapper[4546]: I0201 06:56:23.317201 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ea5f85181fd19a2c40f05c043f35a2db2e30f7eef7cd9e4bf79591787bacd65"
Feb 01 06:56:23 crc kubenswrapper[4546]: I0201 06:56:23.319667 4546 generic.go:334] "Generic (PLEG): container finished" podID="9cf29ca8-566b-4456-8aa6-b93bfaf703f6" containerID="91b9bce1ce39d7be6584b1e8919e1537f07fc027dc46be0fd6b79ceb47b68739" exitCode=0
Feb 01 06:56:23 crc kubenswrapper[4546]: I0201 06:56:23.320143 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fw66k-config-5ntps" event={"ID":"9cf29ca8-566b-4456-8aa6-b93bfaf703f6","Type":"ContainerDied","Data":"91b9bce1ce39d7be6584b1e8919e1537f07fc027dc46be0fd6b79ceb47b68739"}
Feb 01 06:56:23 crc kubenswrapper[4546]: I0201 06:56:23.321943 4546 generic.go:334] "Generic (PLEG): container finished" podID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerID="7d53d77a49298035e2d463605f6879ba3935311b8c9719d2e624c97d587d6bad" exitCode=0
Feb 01 06:56:23 crc kubenswrapper[4546]: I0201 06:56:23.321966 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" event={"ID":"cf952bfa-8c8c-4601-8ea9-f8ac259a7831","Type":"ContainerDied","Data":"7d53d77a49298035e2d463605f6879ba3935311b8c9719d2e624c97d587d6bad"}
Feb 01 06:56:23 crc kubenswrapper[4546]: I0201 06:56:23.321983 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" event={"ID":"cf952bfa-8c8c-4601-8ea9-f8ac259a7831","Type":"ContainerStarted","Data":"6eb42d0917f6452dde020caa6a6b220bacf886e453e65e0e24dcf64fb6b0aeba"}
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.340521 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" event={"ID":"cf952bfa-8c8c-4601-8ea9-f8ac259a7831","Type":"ContainerStarted","Data":"c1c094bd713953a8c2476ed5f01fd66f263e757dbc3cd9d624f3a1677a9fc68e"}
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.340655 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm"
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.367894 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" podStartSLOduration=3.367880748 podStartE2EDuration="3.367880748s" podCreationTimestamp="2026-02-01 06:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:56:24.361720084 +0000 UTC m=+815.012656100" watchObservedRunningTime="2026-02-01 06:56:24.367880748 +0000 UTC m=+815.018816764"
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.682524 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.816376 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf9r5\" (UniqueName: \"kubernetes.io/projected/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-kube-api-access-kf9r5\") pod \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") "
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.817525 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-scripts\") pod \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") "
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.817611 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-additional-scripts\") pod \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") "
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.817739 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run\") pod \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") "
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.817968 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-log-ovn\") pod \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") "
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.818070 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run-ovn\") pod \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\" (UID: \"9cf29ca8-566b-4456-8aa6-b93bfaf703f6\") "
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.818552 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9cf29ca8-566b-4456-8aa6-b93bfaf703f6" (UID: "9cf29ca8-566b-4456-8aa6-b93bfaf703f6"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.818620 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run" (OuterVolumeSpecName: "var-run") pod "9cf29ca8-566b-4456-8aa6-b93bfaf703f6" (UID: "9cf29ca8-566b-4456-8aa6-b93bfaf703f6"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.818649 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9cf29ca8-566b-4456-8aa6-b93bfaf703f6" (UID: "9cf29ca8-566b-4456-8aa6-b93bfaf703f6"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.818674 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9cf29ca8-566b-4456-8aa6-b93bfaf703f6" (UID: "9cf29ca8-566b-4456-8aa6-b93bfaf703f6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.818849 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-scripts" (OuterVolumeSpecName: "scripts") pod "9cf29ca8-566b-4456-8aa6-b93bfaf703f6" (UID: "9cf29ca8-566b-4456-8aa6-b93bfaf703f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.819177 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.819203 4546 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.819221 4546 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run\") on node \"crc\" DevicePath \"\""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.819233 4546 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.819245 4546 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.837067 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-kube-api-access-kf9r5" (OuterVolumeSpecName: "kube-api-access-kf9r5") pod "9cf29ca8-566b-4456-8aa6-b93bfaf703f6" (UID: "9cf29ca8-566b-4456-8aa6-b93bfaf703f6"). InnerVolumeSpecName "kube-api-access-kf9r5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:56:24 crc kubenswrapper[4546]: I0201 06:56:24.921492 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf9r5\" (UniqueName: \"kubernetes.io/projected/9cf29ca8-566b-4456-8aa6-b93bfaf703f6-kube-api-access-kf9r5\") on node \"crc\" DevicePath \"\""
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.348462 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fw66k-config-5ntps" event={"ID":"9cf29ca8-566b-4456-8aa6-b93bfaf703f6","Type":"ContainerDied","Data":"2cbe9bed8c580d25dfdbabf710193d596479794fdac4c7b554eb0dad01316472"}
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.348525 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fw66k-config-5ntps"
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.348530 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cbe9bed8c580d25dfdbabf710193d596479794fdac4c7b554eb0dad01316472"
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.420742 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fw66k-config-5ntps"]
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.429478 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fw66k-config-5ntps"]
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.559683 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-fw66k-config-scwhj"]
Feb 01 06:56:25 crc kubenswrapper[4546]: E0201 06:56:25.560273 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf29ca8-566b-4456-8aa6-b93bfaf703f6" containerName="ovn-config"
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.560352 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf29ca8-566b-4456-8aa6-b93bfaf703f6" containerName="ovn-config"
Feb 01 06:56:25 crc kubenswrapper[4546]: E0201 06:56:25.560435 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715ff32b-ef52-42e4-8f3c-7e88c6612a20" containerName="mariadb-account-create-update"
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.560498 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="715ff32b-ef52-42e4-8f3c-7e88c6612a20" containerName="mariadb-account-create-update"
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.560749 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="715ff32b-ef52-42e4-8f3c-7e88c6612a20" containerName="mariadb-account-create-update"
Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.560815 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf29ca8-566b-4456-8aa6-b93bfaf703f6"
containerName="ovn-config" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.561513 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.564315 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.584389 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fw66k-config-scwhj"] Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.636018 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run-ovn\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.636171 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-log-ovn\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.636206 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.636245 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbf7l\" (UniqueName: 
\"kubernetes.io/projected/e13413d4-5a29-4b1a-b717-76add98c9a19-kube-api-access-fbf7l\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.636266 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-additional-scripts\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.636314 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-scripts\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.668525 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf29ca8-566b-4456-8aa6-b93bfaf703f6" path="/var/lib/kubelet/pods/9cf29ca8-566b-4456-8aa6-b93bfaf703f6/volumes" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.738496 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-scripts\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.739023 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run-ovn\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: 
\"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.739227 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-log-ovn\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.739280 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.739362 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbf7l\" (UniqueName: \"kubernetes.io/projected/e13413d4-5a29-4b1a-b717-76add98c9a19-kube-api-access-fbf7l\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.739397 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-additional-scripts\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.740337 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: 
\"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.740346 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-log-ovn\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.740510 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run-ovn\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.741511 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-additional-scripts\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.742110 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-scripts\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.766212 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbf7l\" (UniqueName: \"kubernetes.io/projected/e13413d4-5a29-4b1a-b717-76add98c9a19-kube-api-access-fbf7l\") pod \"ovn-controller-fw66k-config-scwhj\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " 
pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.859556 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-fw66k" Feb 01 06:56:25 crc kubenswrapper[4546]: I0201 06:56:25.881247 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:26 crc kubenswrapper[4546]: I0201 06:56:26.293056 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-fw66k-config-scwhj"] Feb 01 06:56:26 crc kubenswrapper[4546]: I0201 06:56:26.378480 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fw66k-config-scwhj" event={"ID":"e13413d4-5a29-4b1a-b717-76add98c9a19","Type":"ContainerStarted","Data":"a46e3f4fb645480e894fc8e036f27565d12f19af54d8ef293618e770196fe2ec"} Feb 01 06:56:27 crc kubenswrapper[4546]: I0201 06:56:27.387630 4546 generic.go:334] "Generic (PLEG): container finished" podID="e13413d4-5a29-4b1a-b717-76add98c9a19" containerID="5b13d6e367fce99405d6116e016d5e5e8001d97ba89a91a14de572c1e7865c95" exitCode=0 Feb 01 06:56:27 crc kubenswrapper[4546]: I0201 06:56:27.388080 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fw66k-config-scwhj" event={"ID":"e13413d4-5a29-4b1a-b717-76add98c9a19","Type":"ContainerDied","Data":"5b13d6e367fce99405d6116e016d5e5e8001d97ba89a91a14de572c1e7865c95"} Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.692532 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.829294 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-scripts\") pod \"e13413d4-5a29-4b1a-b717-76add98c9a19\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.829414 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbf7l\" (UniqueName: \"kubernetes.io/projected/e13413d4-5a29-4b1a-b717-76add98c9a19-kube-api-access-fbf7l\") pod \"e13413d4-5a29-4b1a-b717-76add98c9a19\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.829681 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-log-ovn\") pod \"e13413d4-5a29-4b1a-b717-76add98c9a19\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.829725 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run-ovn\") pod \"e13413d4-5a29-4b1a-b717-76add98c9a19\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.829717 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e13413d4-5a29-4b1a-b717-76add98c9a19" (UID: "e13413d4-5a29-4b1a-b717-76add98c9a19"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.829782 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run\") pod \"e13413d4-5a29-4b1a-b717-76add98c9a19\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.829833 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e13413d4-5a29-4b1a-b717-76add98c9a19" (UID: "e13413d4-5a29-4b1a-b717-76add98c9a19"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.829879 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-additional-scripts\") pod \"e13413d4-5a29-4b1a-b717-76add98c9a19\" (UID: \"e13413d4-5a29-4b1a-b717-76add98c9a19\") " Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.829924 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run" (OuterVolumeSpecName: "var-run") pod "e13413d4-5a29-4b1a-b717-76add98c9a19" (UID: "e13413d4-5a29-4b1a-b717-76add98c9a19"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.830391 4546 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.830413 4546 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.830422 4546 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e13413d4-5a29-4b1a-b717-76add98c9a19-var-run\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.830427 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e13413d4-5a29-4b1a-b717-76add98c9a19" (UID: "e13413d4-5a29-4b1a-b717-76add98c9a19"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.830703 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-scripts" (OuterVolumeSpecName: "scripts") pod "e13413d4-5a29-4b1a-b717-76add98c9a19" (UID: "e13413d4-5a29-4b1a-b717-76add98c9a19"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.836256 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13413d4-5a29-4b1a-b717-76add98c9a19-kube-api-access-fbf7l" (OuterVolumeSpecName: "kube-api-access-fbf7l") pod "e13413d4-5a29-4b1a-b717-76add98c9a19" (UID: "e13413d4-5a29-4b1a-b717-76add98c9a19"). InnerVolumeSpecName "kube-api-access-fbf7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.933316 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbf7l\" (UniqueName: \"kubernetes.io/projected/e13413d4-5a29-4b1a-b717-76add98c9a19-kube-api-access-fbf7l\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.933714 4546 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:28 crc kubenswrapper[4546]: I0201 06:56:28.933759 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e13413d4-5a29-4b1a-b717-76add98c9a19-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:29 crc kubenswrapper[4546]: I0201 06:56:29.406087 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-fw66k-config-scwhj" event={"ID":"e13413d4-5a29-4b1a-b717-76add98c9a19","Type":"ContainerDied","Data":"a46e3f4fb645480e894fc8e036f27565d12f19af54d8ef293618e770196fe2ec"} Feb 01 06:56:29 crc kubenswrapper[4546]: I0201 06:56:29.406139 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46e3f4fb645480e894fc8e036f27565d12f19af54d8ef293618e770196fe2ec" Feb 01 06:56:29 crc kubenswrapper[4546]: I0201 06:56:29.406173 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-fw66k-config-scwhj" Feb 01 06:56:29 crc kubenswrapper[4546]: I0201 06:56:29.811188 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-fw66k-config-scwhj"] Feb 01 06:56:29 crc kubenswrapper[4546]: I0201 06:56:29.815998 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-fw66k-config-scwhj"] Feb 01 06:56:31 crc kubenswrapper[4546]: I0201 06:56:31.661872 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e13413d4-5a29-4b1a-b717-76add98c9a19" path="/var/lib/kubelet/pods/e13413d4-5a29-4b1a-b717-76add98c9a19/volumes" Feb 01 06:56:31 crc kubenswrapper[4546]: I0201 06:56:31.971036 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.022210 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db666964f-97rr4"] Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.022634 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6db666964f-97rr4" podUID="8850f5c5-318a-476c-8125-55bfcdc24d8b" containerName="dnsmasq-dns" containerID="cri-o://883e7ee92dc9c006c2ee3b5023b11b62bca942ddbb338cc21935a5813481af6a" gracePeriod=10 Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.434379 4546 generic.go:334] "Generic (PLEG): container finished" podID="8850f5c5-318a-476c-8125-55bfcdc24d8b" containerID="883e7ee92dc9c006c2ee3b5023b11b62bca942ddbb338cc21935a5813481af6a" exitCode=0 Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.434508 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db666964f-97rr4" event={"ID":"8850f5c5-318a-476c-8125-55bfcdc24d8b","Type":"ContainerDied","Data":"883e7ee92dc9c006c2ee3b5023b11b62bca942ddbb338cc21935a5813481af6a"} Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.434791 
4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db666964f-97rr4" event={"ID":"8850f5c5-318a-476c-8125-55bfcdc24d8b","Type":"ContainerDied","Data":"d826fc6c7c0c4e7ddcf88e9b79607b0281abcf3005561028e51af05ad8fe7b13"} Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.434810 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d826fc6c7c0c4e7ddcf88e9b79607b0281abcf3005561028e51af05ad8fe7b13" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.451848 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.602324 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgdv7\" (UniqueName: \"kubernetes.io/projected/8850f5c5-318a-476c-8125-55bfcdc24d8b-kube-api-access-mgdv7\") pod \"8850f5c5-318a-476c-8125-55bfcdc24d8b\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.602385 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-config\") pod \"8850f5c5-318a-476c-8125-55bfcdc24d8b\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.602519 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-nb\") pod \"8850f5c5-318a-476c-8125-55bfcdc24d8b\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.602554 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-dns-svc\") pod 
\"8850f5c5-318a-476c-8125-55bfcdc24d8b\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.602618 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-sb\") pod \"8850f5c5-318a-476c-8125-55bfcdc24d8b\" (UID: \"8850f5c5-318a-476c-8125-55bfcdc24d8b\") " Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.608166 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8850f5c5-318a-476c-8125-55bfcdc24d8b-kube-api-access-mgdv7" (OuterVolumeSpecName: "kube-api-access-mgdv7") pod "8850f5c5-318a-476c-8125-55bfcdc24d8b" (UID: "8850f5c5-318a-476c-8125-55bfcdc24d8b"). InnerVolumeSpecName "kube-api-access-mgdv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.639973 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8850f5c5-318a-476c-8125-55bfcdc24d8b" (UID: "8850f5c5-318a-476c-8125-55bfcdc24d8b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.640633 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8850f5c5-318a-476c-8125-55bfcdc24d8b" (UID: "8850f5c5-318a-476c-8125-55bfcdc24d8b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.657521 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-config" (OuterVolumeSpecName: "config") pod "8850f5c5-318a-476c-8125-55bfcdc24d8b" (UID: "8850f5c5-318a-476c-8125-55bfcdc24d8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.679084 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8850f5c5-318a-476c-8125-55bfcdc24d8b" (UID: "8850f5c5-318a-476c-8125-55bfcdc24d8b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.705438 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgdv7\" (UniqueName: \"kubernetes.io/projected/8850f5c5-318a-476c-8125-55bfcdc24d8b-kube-api-access-mgdv7\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.705493 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.705505 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:32 crc kubenswrapper[4546]: I0201 06:56:32.705518 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:32 crc 
kubenswrapper[4546]: I0201 06:56:32.705527 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8850f5c5-318a-476c-8125-55bfcdc24d8b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:33 crc kubenswrapper[4546]: I0201 06:56:33.442003 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db666964f-97rr4" Feb 01 06:56:33 crc kubenswrapper[4546]: I0201 06:56:33.472541 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db666964f-97rr4"] Feb 01 06:56:33 crc kubenswrapper[4546]: I0201 06:56:33.482016 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6db666964f-97rr4"] Feb 01 06:56:33 crc kubenswrapper[4546]: I0201 06:56:33.664086 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8850f5c5-318a-476c-8125-55bfcdc24d8b" path="/var/lib/kubelet/pods/8850f5c5-318a-476c-8125-55bfcdc24d8b/volumes" Feb 01 06:56:37 crc kubenswrapper[4546]: I0201 06:56:37.814500 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 01 06:56:37 crc kubenswrapper[4546]: I0201 06:56:37.904004 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.161413 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hgpc7"] Feb 01 06:56:38 crc kubenswrapper[4546]: E0201 06:56:38.161709 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8850f5c5-318a-476c-8125-55bfcdc24d8b" containerName="dnsmasq-dns" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.161726 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8850f5c5-318a-476c-8125-55bfcdc24d8b" containerName="dnsmasq-dns" Feb 01 06:56:38 crc kubenswrapper[4546]: E0201 06:56:38.161742 4546 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8850f5c5-318a-476c-8125-55bfcdc24d8b" containerName="init" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.161748 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8850f5c5-318a-476c-8125-55bfcdc24d8b" containerName="init" Feb 01 06:56:38 crc kubenswrapper[4546]: E0201 06:56:38.161756 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13413d4-5a29-4b1a-b717-76add98c9a19" containerName="ovn-config" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.161762 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13413d4-5a29-4b1a-b717-76add98c9a19" containerName="ovn-config" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.161915 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13413d4-5a29-4b1a-b717-76add98c9a19" containerName="ovn-config" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.161938 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8850f5c5-318a-476c-8125-55bfcdc24d8b" containerName="dnsmasq-dns" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.162417 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.173576 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hgpc7"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.277737 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-gv2cj"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.278584 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.301115 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gv2cj"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.307524 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d77-account-create-update-r85z2"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.308296 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.310817 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4vcp\" (UniqueName: \"kubernetes.io/projected/fb2844a5-3270-4037-a106-bd22aa315e85-kube-api-access-z4vcp\") pod \"cinder-db-create-hgpc7\" (UID: \"fb2844a5-3270-4037-a106-bd22aa315e85\") " pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.310877 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb2844a5-3270-4037-a106-bd22aa315e85-operator-scripts\") pod \"cinder-db-create-hgpc7\" (UID: \"fb2844a5-3270-4037-a106-bd22aa315e85\") " pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.314111 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.315053 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d77-account-create-update-r85z2"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.393093 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-9kz7g"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.394118 4546 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.398779 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-26c6-account-create-update-kqwq2"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.399689 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.401121 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.418085 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f45xg\" (UniqueName: \"kubernetes.io/projected/62626f59-4035-4cc1-bcfb-219e7782df0b-kube-api-access-f45xg\") pod \"cinder-2d77-account-create-update-r85z2\" (UID: \"62626f59-4035-4cc1-bcfb-219e7782df0b\") " pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.418184 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4vcp\" (UniqueName: \"kubernetes.io/projected/fb2844a5-3270-4037-a106-bd22aa315e85-kube-api-access-z4vcp\") pod \"cinder-db-create-hgpc7\" (UID: \"fb2844a5-3270-4037-a106-bd22aa315e85\") " pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.418221 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62626f59-4035-4cc1-bcfb-219e7782df0b-operator-scripts\") pod \"cinder-2d77-account-create-update-r85z2\" (UID: \"62626f59-4035-4cc1-bcfb-219e7782df0b\") " pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.418251 4546 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb2844a5-3270-4037-a106-bd22aa315e85-operator-scripts\") pod \"cinder-db-create-hgpc7\" (UID: \"fb2844a5-3270-4037-a106-bd22aa315e85\") " pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.418292 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/327fb651-55a3-4732-98be-4e36956c7ff0-operator-scripts\") pod \"barbican-db-create-gv2cj\" (UID: \"327fb651-55a3-4732-98be-4e36956c7ff0\") " pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.418336 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfdjf\" (UniqueName: \"kubernetes.io/projected/327fb651-55a3-4732-98be-4e36956c7ff0-kube-api-access-lfdjf\") pod \"barbican-db-create-gv2cj\" (UID: \"327fb651-55a3-4732-98be-4e36956c7ff0\") " pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.419225 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb2844a5-3270-4037-a106-bd22aa315e85-operator-scripts\") pod \"cinder-db-create-hgpc7\" (UID: \"fb2844a5-3270-4037-a106-bd22aa315e85\") " pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.421817 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-26c6-account-create-update-kqwq2"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.426882 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-9kz7g"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.454216 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4vcp\" (UniqueName: 
\"kubernetes.io/projected/fb2844a5-3270-4037-a106-bd22aa315e85-kube-api-access-z4vcp\") pod \"cinder-db-create-hgpc7\" (UID: \"fb2844a5-3270-4037-a106-bd22aa315e85\") " pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.469281 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-230e-account-create-update-scdj6"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.476747 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.481529 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.486324 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.501638 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-230e-account-create-update-scdj6"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.519394 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdjf\" (UniqueName: \"kubernetes.io/projected/327fb651-55a3-4732-98be-4e36956c7ff0-kube-api-access-lfdjf\") pod \"barbican-db-create-gv2cj\" (UID: \"327fb651-55a3-4732-98be-4e36956c7ff0\") " pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.519739 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6lfj\" (UniqueName: \"kubernetes.io/projected/8a44a572-90bb-4589-b62d-7ffa43f490bc-kube-api-access-x6lfj\") pod \"heat-db-create-9kz7g\" (UID: \"8a44a572-90bb-4589-b62d-7ffa43f490bc\") " pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.519774 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f45xg\" (UniqueName: \"kubernetes.io/projected/62626f59-4035-4cc1-bcfb-219e7782df0b-kube-api-access-f45xg\") pod \"cinder-2d77-account-create-update-r85z2\" (UID: \"62626f59-4035-4cc1-bcfb-219e7782df0b\") " pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.519796 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d2ae5-f55b-484f-bc46-615a464741f2-operator-scripts\") pod \"barbican-26c6-account-create-update-kqwq2\" (UID: \"9e6d2ae5-f55b-484f-bc46-615a464741f2\") " pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.519818 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c58gn\" (UniqueName: \"kubernetes.io/projected/9e6d2ae5-f55b-484f-bc46-615a464741f2-kube-api-access-c58gn\") pod \"barbican-26c6-account-create-update-kqwq2\" (UID: \"9e6d2ae5-f55b-484f-bc46-615a464741f2\") " pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.519897 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62626f59-4035-4cc1-bcfb-219e7782df0b-operator-scripts\") pod \"cinder-2d77-account-create-update-r85z2\" (UID: \"62626f59-4035-4cc1-bcfb-219e7782df0b\") " pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.519963 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/327fb651-55a3-4732-98be-4e36956c7ff0-operator-scripts\") pod \"barbican-db-create-gv2cj\" (UID: \"327fb651-55a3-4732-98be-4e36956c7ff0\") " 
pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.519988 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a44a572-90bb-4589-b62d-7ffa43f490bc-operator-scripts\") pod \"heat-db-create-9kz7g\" (UID: \"8a44a572-90bb-4589-b62d-7ffa43f490bc\") " pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.520839 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62626f59-4035-4cc1-bcfb-219e7782df0b-operator-scripts\") pod \"cinder-2d77-account-create-update-r85z2\" (UID: \"62626f59-4035-4cc1-bcfb-219e7782df0b\") " pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.524384 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/327fb651-55a3-4732-98be-4e36956c7ff0-operator-scripts\") pod \"barbican-db-create-gv2cj\" (UID: \"327fb651-55a3-4732-98be-4e36956c7ff0\") " pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.540309 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f45xg\" (UniqueName: \"kubernetes.io/projected/62626f59-4035-4cc1-bcfb-219e7782df0b-kube-api-access-f45xg\") pod \"cinder-2d77-account-create-update-r85z2\" (UID: \"62626f59-4035-4cc1-bcfb-219e7782df0b\") " pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.543319 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdjf\" (UniqueName: \"kubernetes.io/projected/327fb651-55a3-4732-98be-4e36956c7ff0-kube-api-access-lfdjf\") pod \"barbican-db-create-gv2cj\" (UID: \"327fb651-55a3-4732-98be-4e36956c7ff0\") " 
pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.592842 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.621540 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.621773 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d2ae5-f55b-484f-bc46-615a464741f2-operator-scripts\") pod \"barbican-26c6-account-create-update-kqwq2\" (UID: \"9e6d2ae5-f55b-484f-bc46-615a464741f2\") " pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.621809 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c58gn\" (UniqueName: \"kubernetes.io/projected/9e6d2ae5-f55b-484f-bc46-615a464741f2-kube-api-access-c58gn\") pod \"barbican-26c6-account-create-update-kqwq2\" (UID: \"9e6d2ae5-f55b-484f-bc46-615a464741f2\") " pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.621874 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-operator-scripts\") pod \"heat-230e-account-create-update-scdj6\" (UID: \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\") " pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.621950 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrt2b\" (UniqueName: \"kubernetes.io/projected/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-kube-api-access-jrt2b\") pod 
\"heat-230e-account-create-update-scdj6\" (UID: \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\") " pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.622056 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a44a572-90bb-4589-b62d-7ffa43f490bc-operator-scripts\") pod \"heat-db-create-9kz7g\" (UID: \"8a44a572-90bb-4589-b62d-7ffa43f490bc\") " pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.622130 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6lfj\" (UniqueName: \"kubernetes.io/projected/8a44a572-90bb-4589-b62d-7ffa43f490bc-kube-api-access-x6lfj\") pod \"heat-db-create-9kz7g\" (UID: \"8a44a572-90bb-4589-b62d-7ffa43f490bc\") " pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.622821 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d2ae5-f55b-484f-bc46-615a464741f2-operator-scripts\") pod \"barbican-26c6-account-create-update-kqwq2\" (UID: \"9e6d2ae5-f55b-484f-bc46-615a464741f2\") " pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.622907 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a44a572-90bb-4589-b62d-7ffa43f490bc-operator-scripts\") pod \"heat-db-create-9kz7g\" (UID: \"8a44a572-90bb-4589-b62d-7ffa43f490bc\") " pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.646909 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c58gn\" (UniqueName: \"kubernetes.io/projected/9e6d2ae5-f55b-484f-bc46-615a464741f2-kube-api-access-c58gn\") pod 
\"barbican-26c6-account-create-update-kqwq2\" (UID: \"9e6d2ae5-f55b-484f-bc46-615a464741f2\") " pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.649577 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6lfj\" (UniqueName: \"kubernetes.io/projected/8a44a572-90bb-4589-b62d-7ffa43f490bc-kube-api-access-x6lfj\") pod \"heat-db-create-9kz7g\" (UID: \"8a44a572-90bb-4589-b62d-7ffa43f490bc\") " pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.674115 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-thwg2"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.674923 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.701183 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-thwg2"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.712981 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.722117 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.723114 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrt2b\" (UniqueName: \"kubernetes.io/projected/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-kube-api-access-jrt2b\") pod \"heat-230e-account-create-update-scdj6\" (UID: \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\") " pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.723234 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-operator-scripts\") pod \"heat-230e-account-create-update-scdj6\" (UID: \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\") " pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.723758 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-operator-scripts\") pod \"heat-230e-account-create-update-scdj6\" (UID: \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\") " pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.764756 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-c7gjn"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.766387 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.786743 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrt2b\" (UniqueName: \"kubernetes.io/projected/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-kube-api-access-jrt2b\") pod \"heat-230e-account-create-update-scdj6\" (UID: \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\") " pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.791981 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.792613 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.792759 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q48f2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.800538 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.802316 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.825760 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctqms\" (UniqueName: \"kubernetes.io/projected/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-kube-api-access-ctqms\") pod \"neutron-db-create-thwg2\" (UID: \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\") " pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.832818 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-operator-scripts\") pod \"neutron-db-create-thwg2\" (UID: \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\") " pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.829174 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f812-account-create-update-62t9x"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.834084 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.837361 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c7gjn"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.838921 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.845267 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f812-account-create-update-62t9x"] Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.936106 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w96tq\" (UniqueName: \"kubernetes.io/projected/1d22a574-dd06-478f-937c-6cec20a5777c-kube-api-access-w96tq\") pod \"keystone-db-sync-c7gjn\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.936166 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-operator-scripts\") pod \"neutron-db-create-thwg2\" (UID: \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\") " pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.936198 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-combined-ca-bundle\") pod \"keystone-db-sync-c7gjn\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.936233 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-config-data\") pod \"keystone-db-sync-c7gjn\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.936298 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbwwh\" (UniqueName: \"kubernetes.io/projected/13054411-2b0e-4c43-99c8-b10a5f7e6d07-kube-api-access-hbwwh\") pod \"neutron-f812-account-create-update-62t9x\" (UID: \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\") " pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.936340 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctqms\" (UniqueName: \"kubernetes.io/projected/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-kube-api-access-ctqms\") pod \"neutron-db-create-thwg2\" (UID: \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\") " pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.936370 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13054411-2b0e-4c43-99c8-b10a5f7e6d07-operator-scripts\") pod \"neutron-f812-account-create-update-62t9x\" (UID: \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\") " pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.937057 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-operator-scripts\") pod \"neutron-db-create-thwg2\" (UID: \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\") " pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:38 crc kubenswrapper[4546]: I0201 06:56:38.961653 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ctqms\" (UniqueName: \"kubernetes.io/projected/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-kube-api-access-ctqms\") pod \"neutron-db-create-thwg2\" (UID: \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\") " pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.014215 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.039246 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13054411-2b0e-4c43-99c8-b10a5f7e6d07-operator-scripts\") pod \"neutron-f812-account-create-update-62t9x\" (UID: \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\") " pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.039384 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w96tq\" (UniqueName: \"kubernetes.io/projected/1d22a574-dd06-478f-937c-6cec20a5777c-kube-api-access-w96tq\") pod \"keystone-db-sync-c7gjn\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.039441 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-combined-ca-bundle\") pod \"keystone-db-sync-c7gjn\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.039499 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-config-data\") pod \"keystone-db-sync-c7gjn\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " pod="openstack/keystone-db-sync-c7gjn" Feb 01 
06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.039667 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbwwh\" (UniqueName: \"kubernetes.io/projected/13054411-2b0e-4c43-99c8-b10a5f7e6d07-kube-api-access-hbwwh\") pod \"neutron-f812-account-create-update-62t9x\" (UID: \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\") " pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.039999 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13054411-2b0e-4c43-99c8-b10a5f7e6d07-operator-scripts\") pod \"neutron-f812-account-create-update-62t9x\" (UID: \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\") " pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.043117 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-combined-ca-bundle\") pod \"keystone-db-sync-c7gjn\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.044489 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-config-data\") pod \"keystone-db-sync-c7gjn\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.054791 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w96tq\" (UniqueName: \"kubernetes.io/projected/1d22a574-dd06-478f-937c-6cec20a5777c-kube-api-access-w96tq\") pod \"keystone-db-sync-c7gjn\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 
06:56:39.058020 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbwwh\" (UniqueName: \"kubernetes.io/projected/13054411-2b0e-4c43-99c8-b10a5f7e6d07-kube-api-access-hbwwh\") pod \"neutron-f812-account-create-update-62t9x\" (UID: \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\") " pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.144944 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hgpc7"] Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.182237 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.192053 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.294403 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-gv2cj"] Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.360073 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-9kz7g"] Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.383899 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d77-account-create-update-r85z2"] Feb 01 06:56:39 crc kubenswrapper[4546]: W0201 06:56:39.413789 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a44a572_90bb_4589_b62d_7ffa43f490bc.slice/crio-fec644fec9190b7a1aed0e31df919c949d12d24808afc1bff1dc237be00e7e07 WatchSource:0}: Error finding container fec644fec9190b7a1aed0e31df919c949d12d24808afc1bff1dc237be00e7e07: Status 404 returned error can't find the container with id fec644fec9190b7a1aed0e31df919c949d12d24808afc1bff1dc237be00e7e07 Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.438833 
4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-thwg2"] Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.526986 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-230e-account-create-update-scdj6"] Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.536578 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-thwg2" event={"ID":"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17","Type":"ContainerStarted","Data":"f040c5ae09934fd490ef200a3db5462308d6694fee30238239f2231e5af58dda"} Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.546228 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-26c6-account-create-update-kqwq2"] Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.547250 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9kz7g" event={"ID":"8a44a572-90bb-4589-b62d-7ffa43f490bc","Type":"ContainerStarted","Data":"fec644fec9190b7a1aed0e31df919c949d12d24808afc1bff1dc237be00e7e07"} Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.549587 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d77-account-create-update-r85z2" event={"ID":"62626f59-4035-4cc1-bcfb-219e7782df0b","Type":"ContainerStarted","Data":"0a5634da7eb747240b224ec8b97f43bc51a43e47dcddb8bf99f93f07f72c7057"} Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.580374 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hgpc7" event={"ID":"fb2844a5-3270-4037-a106-bd22aa315e85","Type":"ContainerStarted","Data":"8069b798b09bf1091b67890bc14ad6e9b3bca3199e28ea8ce998f4c621a9e5b3"} Feb 01 06:56:39 crc kubenswrapper[4546]: I0201 06:56:39.586018 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gv2cj" 
event={"ID":"327fb651-55a3-4732-98be-4e36956c7ff0","Type":"ContainerStarted","Data":"006985fb83b6864c230f8d6aa1d26b48cb70b3caf4a2c5c3fc9257e7af3cf40a"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:39.803392 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c7gjn"] Feb 01 06:56:40 crc kubenswrapper[4546]: W0201 06:56:39.810963 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d22a574_dd06_478f_937c_6cec20a5777c.slice/crio-b2c5fd8a262da5a5dacb1784efd8335773dfe3db255e1ded9c21466868e48ae5 WatchSource:0}: Error finding container b2c5fd8a262da5a5dacb1784efd8335773dfe3db255e1ded9c21466868e48ae5: Status 404 returned error can't find the container with id b2c5fd8a262da5a5dacb1784efd8335773dfe3db255e1ded9c21466868e48ae5 Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:39.915301 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f812-account-create-update-62t9x"] Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.597129 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c7gjn" event={"ID":"1d22a574-dd06-478f-937c-6cec20a5777c","Type":"ContainerStarted","Data":"b2c5fd8a262da5a5dacb1784efd8335773dfe3db255e1ded9c21466868e48ae5"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.611846 4546 generic.go:334] "Generic (PLEG): container finished" podID="327fb651-55a3-4732-98be-4e36956c7ff0" containerID="56cc250e7e1e535454538883a5f716a1ab3060ee43ce7d3519d0cf3987409cb4" exitCode=0 Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.611905 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gv2cj" event={"ID":"327fb651-55a3-4732-98be-4e36956c7ff0","Type":"ContainerDied","Data":"56cc250e7e1e535454538883a5f716a1ab3060ee43ce7d3519d0cf3987409cb4"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.626312 4546 generic.go:334] "Generic 
(PLEG): container finished" podID="7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17" containerID="ed60d86f7a325fd3673194488680e68299af922e70b3e097d01fa86d7357325b" exitCode=0 Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.626349 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-thwg2" event={"ID":"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17","Type":"ContainerDied","Data":"ed60d86f7a325fd3673194488680e68299af922e70b3e097d01fa86d7357325b"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.659468 4546 generic.go:334] "Generic (PLEG): container finished" podID="62626f59-4035-4cc1-bcfb-219e7782df0b" containerID="7139512aca28f1e82c87f008ae5b107d1d9fbf12a6b748751c0e3aace5ed7390" exitCode=0 Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.675415 4546 generic.go:334] "Generic (PLEG): container finished" podID="8a44a572-90bb-4589-b62d-7ffa43f490bc" containerID="69b1130ae2ae4499049d91bb6ff8e1d20696717a491e1fdfe94d53f21721dd03" exitCode=0 Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.695057 4546 generic.go:334] "Generic (PLEG): container finished" podID="9e6d2ae5-f55b-484f-bc46-615a464741f2" containerID="2f8f478fc42a4c5b40f2b8c9e8b63ccf464cc2944e337d903df0d33c1f4571fa" exitCode=0 Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.701968 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d77-account-create-update-r85z2" event={"ID":"62626f59-4035-4cc1-bcfb-219e7782df0b","Type":"ContainerDied","Data":"7139512aca28f1e82c87f008ae5b107d1d9fbf12a6b748751c0e3aace5ed7390"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.710918 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9kz7g" event={"ID":"8a44a572-90bb-4589-b62d-7ffa43f490bc","Type":"ContainerDied","Data":"69b1130ae2ae4499049d91bb6ff8e1d20696717a491e1fdfe94d53f21721dd03"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.711763 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-26c6-account-create-update-kqwq2" event={"ID":"9e6d2ae5-f55b-484f-bc46-615a464741f2","Type":"ContainerDied","Data":"2f8f478fc42a4c5b40f2b8c9e8b63ccf464cc2944e337d903df0d33c1f4571fa"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.711833 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-26c6-account-create-update-kqwq2" event={"ID":"9e6d2ae5-f55b-484f-bc46-615a464741f2","Type":"ContainerStarted","Data":"32534f4b919305dfb43ab129a3a0ec03c942dc4d35587b542e5b19b0f4f2a889"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.723664 4546 generic.go:334] "Generic (PLEG): container finished" podID="fb2844a5-3270-4037-a106-bd22aa315e85" containerID="705fa0294061f140bf97c65a1f39858dfd393b235786d0643611d89db698e6a5" exitCode=0 Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.723772 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hgpc7" event={"ID":"fb2844a5-3270-4037-a106-bd22aa315e85","Type":"ContainerDied","Data":"705fa0294061f140bf97c65a1f39858dfd393b235786d0643611d89db698e6a5"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.738159 4546 generic.go:334] "Generic (PLEG): container finished" podID="13054411-2b0e-4c43-99c8-b10a5f7e6d07" containerID="41567098d151d3dad369b421cd0132a0ea4ba02628484b3a240766dfa076593e" exitCode=0 Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.738278 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f812-account-create-update-62t9x" event={"ID":"13054411-2b0e-4c43-99c8-b10a5f7e6d07","Type":"ContainerDied","Data":"41567098d151d3dad369b421cd0132a0ea4ba02628484b3a240766dfa076593e"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.738320 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f812-account-create-update-62t9x" 
event={"ID":"13054411-2b0e-4c43-99c8-b10a5f7e6d07","Type":"ContainerStarted","Data":"14327504011d9c8224cb96565730312d857b51b4c1461bdba3d39e5f6fe79a18"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.758881 4546 generic.go:334] "Generic (PLEG): container finished" podID="5f61fd99-8903-4fd1-a3ce-2c669ff13bd6" containerID="3f0139c92efc216b33e1a122972ed585e59851589ef7bd375886afe857c61534" exitCode=0 Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.758924 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-230e-account-create-update-scdj6" event={"ID":"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6","Type":"ContainerDied","Data":"3f0139c92efc216b33e1a122972ed585e59851589ef7bd375886afe857c61534"} Feb 01 06:56:40 crc kubenswrapper[4546]: I0201 06:56:40.758949 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-230e-account-create-update-scdj6" event={"ID":"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6","Type":"ContainerStarted","Data":"7ceb91d35a4cae2c4d98fc3386b054daf626c45a8ef7b6e8d64e124e3624bfba"} Feb 01 06:56:41 crc kubenswrapper[4546]: E0201 06:56:41.557498 4546 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.196:47260->192.168.26.196:40843: write tcp 192.168.26.196:47260->192.168.26.196:40843: write: broken pipe Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.679724 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wdfhd"] Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.681551 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.696258 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdfhd"] Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.825506 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rsqn\" (UniqueName: \"kubernetes.io/projected/96495570-944a-41ba-88cb-e251b822c062-kube-api-access-6rsqn\") pod \"redhat-operators-wdfhd\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.825561 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-utilities\") pod \"redhat-operators-wdfhd\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.825584 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-catalog-content\") pod \"redhat-operators-wdfhd\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.927844 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rsqn\" (UniqueName: \"kubernetes.io/projected/96495570-944a-41ba-88cb-e251b822c062-kube-api-access-6rsqn\") pod \"redhat-operators-wdfhd\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.927901 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-utilities\") pod \"redhat-operators-wdfhd\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.927929 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-catalog-content\") pod \"redhat-operators-wdfhd\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.928314 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-catalog-content\") pod \"redhat-operators-wdfhd\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.928767 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-utilities\") pod \"redhat-operators-wdfhd\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:41 crc kubenswrapper[4546]: I0201 06:56:41.945572 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rsqn\" (UniqueName: \"kubernetes.io/projected/96495570-944a-41ba-88cb-e251b822c062-kube-api-access-6rsqn\") pod \"redhat-operators-wdfhd\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:42 crc kubenswrapper[4546]: I0201 06:56:42.009203 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.058215 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rnnqr"] Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.062473 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.076985 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnnqr"] Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.190277 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-catalog-content\") pod \"redhat-marketplace-rnnqr\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.190345 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlhcz\" (UniqueName: \"kubernetes.io/projected/24868290-5ac4-46f3-a91d-2023c92666e6-kube-api-access-vlhcz\") pod \"redhat-marketplace-rnnqr\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.190432 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-utilities\") pod \"redhat-marketplace-rnnqr\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.254084 4546 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-5n264"] Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.256772 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.277067 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5n264"] Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.293948 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-utilities\") pod \"redhat-marketplace-rnnqr\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.294127 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-catalog-content\") pod \"redhat-marketplace-rnnqr\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.294208 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlhcz\" (UniqueName: \"kubernetes.io/projected/24868290-5ac4-46f3-a91d-2023c92666e6-kube-api-access-vlhcz\") pod \"redhat-marketplace-rnnqr\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.294938 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-utilities\") pod \"redhat-marketplace-rnnqr\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 
crc kubenswrapper[4546]: I0201 06:56:45.296747 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-catalog-content\") pod \"redhat-marketplace-rnnqr\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.314924 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlhcz\" (UniqueName: \"kubernetes.io/projected/24868290-5ac4-46f3-a91d-2023c92666e6-kube-api-access-vlhcz\") pod \"redhat-marketplace-rnnqr\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.387849 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.396553 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-catalog-content\") pod \"community-operators-5n264\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.396684 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-utilities\") pod \"community-operators-5n264\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.396773 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl8lw\" (UniqueName: 
\"kubernetes.io/projected/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-kube-api-access-nl8lw\") pod \"community-operators-5n264\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.497954 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl8lw\" (UniqueName: \"kubernetes.io/projected/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-kube-api-access-nl8lw\") pod \"community-operators-5n264\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.498051 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-catalog-content\") pod \"community-operators-5n264\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.498434 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-catalog-content\") pod \"community-operators-5n264\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.498500 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-utilities\") pod \"community-operators-5n264\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.498731 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-utilities\") pod \"community-operators-5n264\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.517078 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl8lw\" (UniqueName: \"kubernetes.io/projected/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-kube-api-access-nl8lw\") pod \"community-operators-5n264\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:45 crc kubenswrapper[4546]: I0201 06:56:45.572170 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5n264" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.328526 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.333734 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.451599 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-operator-scripts\") pod \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\" (UID: \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\") " Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.451710 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c58gn\" (UniqueName: \"kubernetes.io/projected/9e6d2ae5-f55b-484f-bc46-615a464741f2-kube-api-access-c58gn\") pod \"9e6d2ae5-f55b-484f-bc46-615a464741f2\" (UID: \"9e6d2ae5-f55b-484f-bc46-615a464741f2\") " Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.451794 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrt2b\" (UniqueName: \"kubernetes.io/projected/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-kube-api-access-jrt2b\") pod \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\" (UID: \"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6\") " Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.451828 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d2ae5-f55b-484f-bc46-615a464741f2-operator-scripts\") pod \"9e6d2ae5-f55b-484f-bc46-615a464741f2\" (UID: \"9e6d2ae5-f55b-484f-bc46-615a464741f2\") " Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.452572 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f61fd99-8903-4fd1-a3ce-2c669ff13bd6" (UID: "5f61fd99-8903-4fd1-a3ce-2c669ff13bd6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.452609 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e6d2ae5-f55b-484f-bc46-615a464741f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e6d2ae5-f55b-484f-bc46-615a464741f2" (UID: "9e6d2ae5-f55b-484f-bc46-615a464741f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.461163 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-kube-api-access-jrt2b" (OuterVolumeSpecName: "kube-api-access-jrt2b") pod "5f61fd99-8903-4fd1-a3ce-2c669ff13bd6" (UID: "5f61fd99-8903-4fd1-a3ce-2c669ff13bd6"). InnerVolumeSpecName "kube-api-access-jrt2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.461340 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6d2ae5-f55b-484f-bc46-615a464741f2-kube-api-access-c58gn" (OuterVolumeSpecName: "kube-api-access-c58gn") pod "9e6d2ae5-f55b-484f-bc46-615a464741f2" (UID: "9e6d2ae5-f55b-484f-bc46-615a464741f2"). InnerVolumeSpecName "kube-api-access-c58gn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.553535 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.553570 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c58gn\" (UniqueName: \"kubernetes.io/projected/9e6d2ae5-f55b-484f-bc46-615a464741f2-kube-api-access-c58gn\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.553585 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrt2b\" (UniqueName: \"kubernetes.io/projected/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6-kube-api-access-jrt2b\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.553596 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d2ae5-f55b-484f-bc46-615a464741f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.848634 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-230e-account-create-update-scdj6" event={"ID":"5f61fd99-8903-4fd1-a3ce-2c669ff13bd6","Type":"ContainerDied","Data":"7ceb91d35a4cae2c4d98fc3386b054daf626c45a8ef7b6e8d64e124e3624bfba"} Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.849042 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ceb91d35a4cae2c4d98fc3386b054daf626c45a8ef7b6e8d64e124e3624bfba" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.848674 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-230e-account-create-update-scdj6" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.851139 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-26c6-account-create-update-kqwq2" event={"ID":"9e6d2ae5-f55b-484f-bc46-615a464741f2","Type":"ContainerDied","Data":"32534f4b919305dfb43ab129a3a0ec03c942dc4d35587b542e5b19b0f4f2a889"} Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.851174 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32534f4b919305dfb43ab129a3a0ec03c942dc4d35587b542e5b19b0f4f2a889" Feb 01 06:56:48 crc kubenswrapper[4546]: I0201 06:56:48.851225 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-26c6-account-create-update-kqwq2" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.910583 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f812-account-create-update-62t9x" event={"ID":"13054411-2b0e-4c43-99c8-b10a5f7e6d07","Type":"ContainerDied","Data":"14327504011d9c8224cb96565730312d857b51b4c1461bdba3d39e5f6fe79a18"} Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.911155 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14327504011d9c8224cb96565730312d857b51b4c1461bdba3d39e5f6fe79a18" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.922054 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-gv2cj" event={"ID":"327fb651-55a3-4732-98be-4e36956c7ff0","Type":"ContainerDied","Data":"006985fb83b6864c230f8d6aa1d26b48cb70b3caf4a2c5c3fc9257e7af3cf40a"} Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.922091 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="006985fb83b6864c230f8d6aa1d26b48cb70b3caf4a2c5c3fc9257e7af3cf40a" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.932675 4546 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-db-create-thwg2" event={"ID":"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17","Type":"ContainerDied","Data":"f040c5ae09934fd490ef200a3db5462308d6694fee30238239f2231e5af58dda"} Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.932702 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f040c5ae09934fd490ef200a3db5462308d6694fee30238239f2231e5af58dda" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.934703 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-9kz7g" event={"ID":"8a44a572-90bb-4589-b62d-7ffa43f490bc","Type":"ContainerDied","Data":"fec644fec9190b7a1aed0e31df919c949d12d24808afc1bff1dc237be00e7e07"} Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.934724 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fec644fec9190b7a1aed0e31df919c949d12d24808afc1bff1dc237be00e7e07" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.938032 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d77-account-create-update-r85z2" event={"ID":"62626f59-4035-4cc1-bcfb-219e7782df0b","Type":"ContainerDied","Data":"0a5634da7eb747240b224ec8b97f43bc51a43e47dcddb8bf99f93f07f72c7057"} Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.938062 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a5634da7eb747240b224ec8b97f43bc51a43e47dcddb8bf99f93f07f72c7057" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.955161 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.955589 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hgpc7" event={"ID":"fb2844a5-3270-4037-a106-bd22aa315e85","Type":"ContainerDied","Data":"8069b798b09bf1091b67890bc14ad6e9b3bca3199e28ea8ce998f4c621a9e5b3"} Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.955645 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8069b798b09bf1091b67890bc14ad6e9b3bca3199e28ea8ce998f4c621a9e5b3" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.957368 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.963823 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.977291 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.984544 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:50 crc kubenswrapper[4546]: I0201 06:56:50.999518 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.105652 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfdjf\" (UniqueName: \"kubernetes.io/projected/327fb651-55a3-4732-98be-4e36956c7ff0-kube-api-access-lfdjf\") pod \"327fb651-55a3-4732-98be-4e36956c7ff0\" (UID: \"327fb651-55a3-4732-98be-4e36956c7ff0\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.105712 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a44a572-90bb-4589-b62d-7ffa43f490bc-operator-scripts\") pod \"8a44a572-90bb-4589-b62d-7ffa43f490bc\" (UID: \"8a44a572-90bb-4589-b62d-7ffa43f490bc\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.105742 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/327fb651-55a3-4732-98be-4e36956c7ff0-operator-scripts\") pod \"327fb651-55a3-4732-98be-4e36956c7ff0\" (UID: \"327fb651-55a3-4732-98be-4e36956c7ff0\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.105762 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-operator-scripts\") pod \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\" (UID: \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.105789 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4vcp\" (UniqueName: \"kubernetes.io/projected/fb2844a5-3270-4037-a106-bd22aa315e85-kube-api-access-z4vcp\") pod \"fb2844a5-3270-4037-a106-bd22aa315e85\" (UID: \"fb2844a5-3270-4037-a106-bd22aa315e85\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.105827 4546 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-f45xg\" (UniqueName: \"kubernetes.io/projected/62626f59-4035-4cc1-bcfb-219e7782df0b-kube-api-access-f45xg\") pod \"62626f59-4035-4cc1-bcfb-219e7782df0b\" (UID: \"62626f59-4035-4cc1-bcfb-219e7782df0b\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.105918 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13054411-2b0e-4c43-99c8-b10a5f7e6d07-operator-scripts\") pod \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\" (UID: \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.105953 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctqms\" (UniqueName: \"kubernetes.io/projected/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-kube-api-access-ctqms\") pod \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\" (UID: \"7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.105996 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbwwh\" (UniqueName: \"kubernetes.io/projected/13054411-2b0e-4c43-99c8-b10a5f7e6d07-kube-api-access-hbwwh\") pod \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\" (UID: \"13054411-2b0e-4c43-99c8-b10a5f7e6d07\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.106060 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb2844a5-3270-4037-a106-bd22aa315e85-operator-scripts\") pod \"fb2844a5-3270-4037-a106-bd22aa315e85\" (UID: \"fb2844a5-3270-4037-a106-bd22aa315e85\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.106088 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62626f59-4035-4cc1-bcfb-219e7782df0b-operator-scripts\") pod 
\"62626f59-4035-4cc1-bcfb-219e7782df0b\" (UID: \"62626f59-4035-4cc1-bcfb-219e7782df0b\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.106125 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6lfj\" (UniqueName: \"kubernetes.io/projected/8a44a572-90bb-4589-b62d-7ffa43f490bc-kube-api-access-x6lfj\") pod \"8a44a572-90bb-4589-b62d-7ffa43f490bc\" (UID: \"8a44a572-90bb-4589-b62d-7ffa43f490bc\") " Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.106360 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327fb651-55a3-4732-98be-4e36956c7ff0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "327fb651-55a3-4732-98be-4e36956c7ff0" (UID: "327fb651-55a3-4732-98be-4e36956c7ff0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.106834 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/327fb651-55a3-4732-98be-4e36956c7ff0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.107261 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a44a572-90bb-4589-b62d-7ffa43f490bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a44a572-90bb-4589-b62d-7ffa43f490bc" (UID: "8a44a572-90bb-4589-b62d-7ffa43f490bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.107339 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13054411-2b0e-4c43-99c8-b10a5f7e6d07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "13054411-2b0e-4c43-99c8-b10a5f7e6d07" (UID: "13054411-2b0e-4c43-99c8-b10a5f7e6d07"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.107945 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62626f59-4035-4cc1-bcfb-219e7782df0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62626f59-4035-4cc1-bcfb-219e7782df0b" (UID: "62626f59-4035-4cc1-bcfb-219e7782df0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.107968 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17" (UID: "7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.108470 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb2844a5-3270-4037-a106-bd22aa315e85-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb2844a5-3270-4037-a106-bd22aa315e85" (UID: "fb2844a5-3270-4037-a106-bd22aa315e85"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.113615 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62626f59-4035-4cc1-bcfb-219e7782df0b-kube-api-access-f45xg" (OuterVolumeSpecName: "kube-api-access-f45xg") pod "62626f59-4035-4cc1-bcfb-219e7782df0b" (UID: "62626f59-4035-4cc1-bcfb-219e7782df0b"). InnerVolumeSpecName "kube-api-access-f45xg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.117298 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327fb651-55a3-4732-98be-4e36956c7ff0-kube-api-access-lfdjf" (OuterVolumeSpecName: "kube-api-access-lfdjf") pod "327fb651-55a3-4732-98be-4e36956c7ff0" (UID: "327fb651-55a3-4732-98be-4e36956c7ff0"). InnerVolumeSpecName "kube-api-access-lfdjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.117446 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a44a572-90bb-4589-b62d-7ffa43f490bc-kube-api-access-x6lfj" (OuterVolumeSpecName: "kube-api-access-x6lfj") pod "8a44a572-90bb-4589-b62d-7ffa43f490bc" (UID: "8a44a572-90bb-4589-b62d-7ffa43f490bc"). InnerVolumeSpecName "kube-api-access-x6lfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.119266 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13054411-2b0e-4c43-99c8-b10a5f7e6d07-kube-api-access-hbwwh" (OuterVolumeSpecName: "kube-api-access-hbwwh") pod "13054411-2b0e-4c43-99c8-b10a5f7e6d07" (UID: "13054411-2b0e-4c43-99c8-b10a5f7e6d07"). InnerVolumeSpecName "kube-api-access-hbwwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.125596 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2844a5-3270-4037-a106-bd22aa315e85-kube-api-access-z4vcp" (OuterVolumeSpecName: "kube-api-access-z4vcp") pod "fb2844a5-3270-4037-a106-bd22aa315e85" (UID: "fb2844a5-3270-4037-a106-bd22aa315e85"). InnerVolumeSpecName "kube-api-access-z4vcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.125668 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-kube-api-access-ctqms" (OuterVolumeSpecName: "kube-api-access-ctqms") pod "7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17" (UID: "7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17"). InnerVolumeSpecName "kube-api-access-ctqms". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.195354 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnnqr"] Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208619 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6lfj\" (UniqueName: \"kubernetes.io/projected/8a44a572-90bb-4589-b62d-7ffa43f490bc-kube-api-access-x6lfj\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208639 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfdjf\" (UniqueName: \"kubernetes.io/projected/327fb651-55a3-4732-98be-4e36956c7ff0-kube-api-access-lfdjf\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208650 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a44a572-90bb-4589-b62d-7ffa43f490bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208659 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208669 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4vcp\" (UniqueName: 
\"kubernetes.io/projected/fb2844a5-3270-4037-a106-bd22aa315e85-kube-api-access-z4vcp\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208679 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f45xg\" (UniqueName: \"kubernetes.io/projected/62626f59-4035-4cc1-bcfb-219e7782df0b-kube-api-access-f45xg\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208689 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13054411-2b0e-4c43-99c8-b10a5f7e6d07-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208698 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctqms\" (UniqueName: \"kubernetes.io/projected/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17-kube-api-access-ctqms\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208707 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbwwh\" (UniqueName: \"kubernetes.io/projected/13054411-2b0e-4c43-99c8-b10a5f7e6d07-kube-api-access-hbwwh\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208715 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb2844a5-3270-4037-a106-bd22aa315e85-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.208725 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62626f59-4035-4cc1-bcfb-219e7782df0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.403181 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5n264"] Feb 01 06:56:51 crc kubenswrapper[4546]: 
I0201 06:56:51.413977 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdfhd"] Feb 01 06:56:51 crc kubenswrapper[4546]: W0201 06:56:51.418167 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a778d80_9088_4ea7_82fc_8c2ff4e0ba9c.slice/crio-1fda9466eeb7423cd7c1b5514e6212c6f978cc263f7f7ce5387090f1f295bf7f WatchSource:0}: Error finding container 1fda9466eeb7423cd7c1b5514e6212c6f978cc263f7f7ce5387090f1f295bf7f: Status 404 returned error can't find the container with id 1fda9466eeb7423cd7c1b5514e6212c6f978cc263f7f7ce5387090f1f295bf7f Feb 01 06:56:51 crc kubenswrapper[4546]: W0201 06:56:51.424591 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96495570_944a_41ba_88cb_e251b822c062.slice/crio-86b4306a967b6114576efaf2b4e3043b38719535ad94934d2320e2cf30fb7b6b WatchSource:0}: Error finding container 86b4306a967b6114576efaf2b4e3043b38719535ad94934d2320e2cf30fb7b6b: Status 404 returned error can't find the container with id 86b4306a967b6114576efaf2b4e3043b38719535ad94934d2320e2cf30fb7b6b Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.452583 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nmb7h"] Feb 01 06:56:51 crc kubenswrapper[4546]: E0201 06:56:51.452935 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62626f59-4035-4cc1-bcfb-219e7782df0b" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.452954 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="62626f59-4035-4cc1-bcfb-219e7782df0b" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: E0201 06:56:51.452971 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f61fd99-8903-4fd1-a3ce-2c669ff13bd6" 
containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.452978 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f61fd99-8903-4fd1-a3ce-2c669ff13bd6" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: E0201 06:56:51.452989 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13054411-2b0e-4c43-99c8-b10a5f7e6d07" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.452995 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="13054411-2b0e-4c43-99c8-b10a5f7e6d07" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: E0201 06:56:51.453002 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453008 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: E0201 06:56:51.453020 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a44a572-90bb-4589-b62d-7ffa43f490bc" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453025 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a44a572-90bb-4589-b62d-7ffa43f490bc" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: E0201 06:56:51.453038 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2844a5-3270-4037-a106-bd22aa315e85" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453044 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2844a5-3270-4037-a106-bd22aa315e85" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: E0201 06:56:51.453059 4546 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6d2ae5-f55b-484f-bc46-615a464741f2" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453066 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6d2ae5-f55b-484f-bc46-615a464741f2" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: E0201 06:56:51.453075 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327fb651-55a3-4732-98be-4e36956c7ff0" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453081 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="327fb651-55a3-4732-98be-4e36956c7ff0" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453221 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="13054411-2b0e-4c43-99c8-b10a5f7e6d07" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453233 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f61fd99-8903-4fd1-a3ce-2c669ff13bd6" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453240 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a44a572-90bb-4589-b62d-7ffa43f490bc" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453250 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e6d2ae5-f55b-484f-bc46-615a464741f2" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453261 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="62626f59-4035-4cc1-bcfb-219e7782df0b" containerName="mariadb-account-create-update" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453270 4546 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fb2844a5-3270-4037-a106-bd22aa315e85" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453279 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="327fb651-55a3-4732-98be-4e36956c7ff0" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.453288 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17" containerName="mariadb-database-create" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.462878 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nmb7h"] Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.464077 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.616394 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-utilities\") pod \"certified-operators-nmb7h\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.616715 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-catalog-content\") pod \"certified-operators-nmb7h\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.616789 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghw2\" (UniqueName: \"kubernetes.io/projected/c07666c5-454b-4d29-8574-bfda5f24b39d-kube-api-access-9ghw2\") pod 
\"certified-operators-nmb7h\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.718637 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-utilities\") pod \"certified-operators-nmb7h\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.718699 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-catalog-content\") pod \"certified-operators-nmb7h\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.718772 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghw2\" (UniqueName: \"kubernetes.io/projected/c07666c5-454b-4d29-8574-bfda5f24b39d-kube-api-access-9ghw2\") pod \"certified-operators-nmb7h\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.719092 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-utilities\") pod \"certified-operators-nmb7h\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.719138 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-catalog-content\") pod \"certified-operators-nmb7h\" (UID: 
\"c07666c5-454b-4d29-8574-bfda5f24b39d\") " pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.738287 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghw2\" (UniqueName: \"kubernetes.io/projected/c07666c5-454b-4d29-8574-bfda5f24b39d-kube-api-access-9ghw2\") pod \"certified-operators-nmb7h\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.805348 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.979590 4546 generic.go:334] "Generic (PLEG): container finished" podID="24868290-5ac4-46f3-a91d-2023c92666e6" containerID="b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f" exitCode=0 Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.980540 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnnqr" event={"ID":"24868290-5ac4-46f3-a91d-2023c92666e6","Type":"ContainerDied","Data":"b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f"} Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.980567 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnnqr" event={"ID":"24868290-5ac4-46f3-a91d-2023c92666e6","Type":"ContainerStarted","Data":"87f2ae2e44b732897454ce7134e537417146d212412fb7ceaa094a722e7bdc3d"} Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.992028 4546 generic.go:334] "Generic (PLEG): container finished" podID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerID="8f09abbe04bbc350aa0c372d9fc629b50cb2b8b57ec024a81c2c914d060c42ff" exitCode=0 Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.992143 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-5n264" event={"ID":"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c","Type":"ContainerDied","Data":"8f09abbe04bbc350aa0c372d9fc629b50cb2b8b57ec024a81c2c914d060c42ff"} Feb 01 06:56:51 crc kubenswrapper[4546]: I0201 06:56:51.992199 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5n264" event={"ID":"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c","Type":"ContainerStarted","Data":"1fda9466eeb7423cd7c1b5514e6212c6f978cc263f7f7ce5387090f1f295bf7f"} Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.000173 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c7gjn" event={"ID":"1d22a574-dd06-478f-937c-6cec20a5777c","Type":"ContainerStarted","Data":"e56b7a4aa8f6dcd5d20db3ec6730c32b18bf116f3bc1a0d4982702a6ef39fc61"} Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.005600 4546 generic.go:334] "Generic (PLEG): container finished" podID="96495570-944a-41ba-88cb-e251b822c062" containerID="8085beef3a5ccfc945432735f3293c03b0145982c5d29519394b0a078eff7e71" exitCode=0 Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.005687 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdfhd" event={"ID":"96495570-944a-41ba-88cb-e251b822c062","Type":"ContainerDied","Data":"8085beef3a5ccfc945432735f3293c03b0145982c5d29519394b0a078eff7e71"} Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.005722 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdfhd" event={"ID":"96495570-944a-41ba-88cb-e251b822c062","Type":"ContainerStarted","Data":"86b4306a967b6114576efaf2b4e3043b38719535ad94934d2320e2cf30fb7b6b"} Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.012379 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-thwg2" Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.012579 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d77-account-create-update-r85z2" Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.012900 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hgpc7" Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.013214 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-9kz7g" Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.013494 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f812-account-create-update-62t9x" Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.013761 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-gv2cj" Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.014389 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sscjj" event={"ID":"2bf01534-1b7d-4f23-bc2c-02cb329a2036","Type":"ContainerStarted","Data":"9e86461b8024e892cac94f2fcccea6cdb576941b61c420446695ed6de77ab5c0"} Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.070816 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-sscjj" podStartSLOduration=2.437659809 podStartE2EDuration="41.070793087s" podCreationTimestamp="2026-02-01 06:56:11 +0000 UTC" firstStartedPulling="2026-02-01 06:56:12.189769395 +0000 UTC m=+802.840705411" lastFinishedPulling="2026-02-01 06:56:50.822902672 +0000 UTC m=+841.473838689" observedRunningTime="2026-02-01 06:56:52.033239196 +0000 UTC m=+842.684175213" watchObservedRunningTime="2026-02-01 06:56:52.070793087 +0000 UTC m=+842.721729102" Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 
06:56:52.201649 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-c7gjn" podStartSLOduration=3.2442471 podStartE2EDuration="14.201630631s" podCreationTimestamp="2026-02-01 06:56:38 +0000 UTC" firstStartedPulling="2026-02-01 06:56:39.813419302 +0000 UTC m=+830.464355318" lastFinishedPulling="2026-02-01 06:56:50.770802833 +0000 UTC m=+841.421738849" observedRunningTime="2026-02-01 06:56:52.078947979 +0000 UTC m=+842.729883995" watchObservedRunningTime="2026-02-01 06:56:52.201630631 +0000 UTC m=+842.852566647" Feb 01 06:56:52 crc kubenswrapper[4546]: I0201 06:56:52.214571 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nmb7h"] Feb 01 06:56:53 crc kubenswrapper[4546]: I0201 06:56:53.030234 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5n264" event={"ID":"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c","Type":"ContainerStarted","Data":"9b0e71c07f08fb962496de3dfe84f84792b33a4ca949f820e59860e2d09e1d8f"} Feb 01 06:56:53 crc kubenswrapper[4546]: I0201 06:56:53.033222 4546 generic.go:334] "Generic (PLEG): container finished" podID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerID="4af2fb52b3c3c8f1e22ecec2409f13938bb3ed88af61e88c2cf0e3b79d7c102c" exitCode=0 Feb 01 06:56:53 crc kubenswrapper[4546]: I0201 06:56:53.033319 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmb7h" event={"ID":"c07666c5-454b-4d29-8574-bfda5f24b39d","Type":"ContainerDied","Data":"4af2fb52b3c3c8f1e22ecec2409f13938bb3ed88af61e88c2cf0e3b79d7c102c"} Feb 01 06:56:53 crc kubenswrapper[4546]: I0201 06:56:53.033373 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmb7h" event={"ID":"c07666c5-454b-4d29-8574-bfda5f24b39d","Type":"ContainerStarted","Data":"e845270320bcc47eae8baf5a276ed391b7aab05be92a676a95d0f6d5c208efe7"} Feb 01 06:56:53 crc 
kubenswrapper[4546]: I0201 06:56:53.039672 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdfhd" event={"ID":"96495570-944a-41ba-88cb-e251b822c062","Type":"ContainerStarted","Data":"fce5c0bd3b9c03c122cf7625a7efd9e32da5c57ee952db510965ae7dac15d9cc"} Feb 01 06:56:53 crc kubenswrapper[4546]: I0201 06:56:53.042081 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnnqr" event={"ID":"24868290-5ac4-46f3-a91d-2023c92666e6","Type":"ContainerStarted","Data":"3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e"} Feb 01 06:56:54 crc kubenswrapper[4546]: I0201 06:56:54.053107 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmb7h" event={"ID":"c07666c5-454b-4d29-8574-bfda5f24b39d","Type":"ContainerStarted","Data":"8d7a360baf46332323a6ee69d6a6a60b8132bd387a7109471db6ea3cc3774242"} Feb 01 06:56:55 crc kubenswrapper[4546]: I0201 06:56:55.065434 4546 generic.go:334] "Generic (PLEG): container finished" podID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerID="9b0e71c07f08fb962496de3dfe84f84792b33a4ca949f820e59860e2d09e1d8f" exitCode=0 Feb 01 06:56:55 crc kubenswrapper[4546]: I0201 06:56:55.065523 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5n264" event={"ID":"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c","Type":"ContainerDied","Data":"9b0e71c07f08fb962496de3dfe84f84792b33a4ca949f820e59860e2d09e1d8f"} Feb 01 06:56:55 crc kubenswrapper[4546]: I0201 06:56:55.068966 4546 generic.go:334] "Generic (PLEG): container finished" podID="24868290-5ac4-46f3-a91d-2023c92666e6" containerID="3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e" exitCode=0 Feb 01 06:56:55 crc kubenswrapper[4546]: I0201 06:56:55.069539 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnnqr" 
event={"ID":"24868290-5ac4-46f3-a91d-2023c92666e6","Type":"ContainerDied","Data":"3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e"} Feb 01 06:56:56 crc kubenswrapper[4546]: I0201 06:56:56.075900 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5n264" event={"ID":"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c","Type":"ContainerStarted","Data":"946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426"} Feb 01 06:56:56 crc kubenswrapper[4546]: I0201 06:56:56.079598 4546 generic.go:334] "Generic (PLEG): container finished" podID="96495570-944a-41ba-88cb-e251b822c062" containerID="fce5c0bd3b9c03c122cf7625a7efd9e32da5c57ee952db510965ae7dac15d9cc" exitCode=0 Feb 01 06:56:56 crc kubenswrapper[4546]: I0201 06:56:56.079686 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdfhd" event={"ID":"96495570-944a-41ba-88cb-e251b822c062","Type":"ContainerDied","Data":"fce5c0bd3b9c03c122cf7625a7efd9e32da5c57ee952db510965ae7dac15d9cc"} Feb 01 06:56:56 crc kubenswrapper[4546]: I0201 06:56:56.083001 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnnqr" event={"ID":"24868290-5ac4-46f3-a91d-2023c92666e6","Type":"ContainerStarted","Data":"a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1"} Feb 01 06:56:56 crc kubenswrapper[4546]: I0201 06:56:56.110929 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5n264" podStartSLOduration=7.54185144 podStartE2EDuration="11.110912391s" podCreationTimestamp="2026-02-01 06:56:45 +0000 UTC" firstStartedPulling="2026-02-01 06:56:51.998082986 +0000 UTC m=+842.649019001" lastFinishedPulling="2026-02-01 06:56:55.567143937 +0000 UTC m=+846.218079952" observedRunningTime="2026-02-01 06:56:56.102785411 +0000 UTC m=+846.753721427" watchObservedRunningTime="2026-02-01 06:56:56.110912391 +0000 UTC 
m=+846.761848407" Feb 01 06:56:56 crc kubenswrapper[4546]: I0201 06:56:56.152648 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rnnqr" podStartSLOduration=7.612846082 podStartE2EDuration="11.152614543s" podCreationTimestamp="2026-02-01 06:56:45 +0000 UTC" firstStartedPulling="2026-02-01 06:56:51.98168221 +0000 UTC m=+842.632618226" lastFinishedPulling="2026-02-01 06:56:55.521450671 +0000 UTC m=+846.172386687" observedRunningTime="2026-02-01 06:56:56.149200488 +0000 UTC m=+846.800136504" watchObservedRunningTime="2026-02-01 06:56:56.152614543 +0000 UTC m=+846.803550559" Feb 01 06:56:57 crc kubenswrapper[4546]: I0201 06:56:57.093722 4546 generic.go:334] "Generic (PLEG): container finished" podID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerID="8d7a360baf46332323a6ee69d6a6a60b8132bd387a7109471db6ea3cc3774242" exitCode=0 Feb 01 06:56:57 crc kubenswrapper[4546]: I0201 06:56:57.093806 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmb7h" event={"ID":"c07666c5-454b-4d29-8574-bfda5f24b39d","Type":"ContainerDied","Data":"8d7a360baf46332323a6ee69d6a6a60b8132bd387a7109471db6ea3cc3774242"} Feb 01 06:56:57 crc kubenswrapper[4546]: I0201 06:56:57.098232 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdfhd" event={"ID":"96495570-944a-41ba-88cb-e251b822c062","Type":"ContainerStarted","Data":"9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a"} Feb 01 06:56:57 crc kubenswrapper[4546]: I0201 06:56:57.137056 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wdfhd" podStartSLOduration=11.564989506 podStartE2EDuration="16.137035177s" podCreationTimestamp="2026-02-01 06:56:41 +0000 UTC" firstStartedPulling="2026-02-01 06:56:52.007404277 +0000 UTC m=+842.658340292" lastFinishedPulling="2026-02-01 06:56:56.579449948 +0000 UTC 
m=+847.230385963" observedRunningTime="2026-02-01 06:56:57.135741299 +0000 UTC m=+847.786677315" watchObservedRunningTime="2026-02-01 06:56:57.137035177 +0000 UTC m=+847.787971193" Feb 01 06:56:58 crc kubenswrapper[4546]: I0201 06:56:58.108063 4546 generic.go:334] "Generic (PLEG): container finished" podID="1d22a574-dd06-478f-937c-6cec20a5777c" containerID="e56b7a4aa8f6dcd5d20db3ec6730c32b18bf116f3bc1a0d4982702a6ef39fc61" exitCode=0 Feb 01 06:56:58 crc kubenswrapper[4546]: I0201 06:56:58.108158 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c7gjn" event={"ID":"1d22a574-dd06-478f-937c-6cec20a5777c","Type":"ContainerDied","Data":"e56b7a4aa8f6dcd5d20db3ec6730c32b18bf116f3bc1a0d4982702a6ef39fc61"} Feb 01 06:56:58 crc kubenswrapper[4546]: I0201 06:56:58.112598 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmb7h" event={"ID":"c07666c5-454b-4d29-8574-bfda5f24b39d","Type":"ContainerStarted","Data":"d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca"} Feb 01 06:56:58 crc kubenswrapper[4546]: I0201 06:56:58.151191 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nmb7h" podStartSLOduration=2.625216359 podStartE2EDuration="7.151169505s" podCreationTimestamp="2026-02-01 06:56:51 +0000 UTC" firstStartedPulling="2026-02-01 06:56:53.034363677 +0000 UTC m=+843.685299693" lastFinishedPulling="2026-02-01 06:56:57.560316823 +0000 UTC m=+848.211252839" observedRunningTime="2026-02-01 06:56:58.138793816 +0000 UTC m=+848.789729831" watchObservedRunningTime="2026-02-01 06:56:58.151169505 +0000 UTC m=+848.802105521" Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.404166 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.490128 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-config-data\") pod \"1d22a574-dd06-478f-937c-6cec20a5777c\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.490173 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-combined-ca-bundle\") pod \"1d22a574-dd06-478f-937c-6cec20a5777c\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.490232 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w96tq\" (UniqueName: \"kubernetes.io/projected/1d22a574-dd06-478f-937c-6cec20a5777c-kube-api-access-w96tq\") pod \"1d22a574-dd06-478f-937c-6cec20a5777c\" (UID: \"1d22a574-dd06-478f-937c-6cec20a5777c\") " Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.512168 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d22a574-dd06-478f-937c-6cec20a5777c-kube-api-access-w96tq" (OuterVolumeSpecName: "kube-api-access-w96tq") pod "1d22a574-dd06-478f-937c-6cec20a5777c" (UID: "1d22a574-dd06-478f-937c-6cec20a5777c"). InnerVolumeSpecName "kube-api-access-w96tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.531693 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-config-data" (OuterVolumeSpecName: "config-data") pod "1d22a574-dd06-478f-937c-6cec20a5777c" (UID: "1d22a574-dd06-478f-937c-6cec20a5777c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.534068 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d22a574-dd06-478f-937c-6cec20a5777c" (UID: "1d22a574-dd06-478f-937c-6cec20a5777c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.593105 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.593482 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d22a574-dd06-478f-937c-6cec20a5777c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:59 crc kubenswrapper[4546]: I0201 06:56:59.593567 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w96tq\" (UniqueName: \"kubernetes.io/projected/1d22a574-dd06-478f-937c-6cec20a5777c-kube-api-access-w96tq\") on node \"crc\" DevicePath \"\"" Feb 01 06:56:59 crc kubenswrapper[4546]: E0201 06:56:59.777717 4546 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d22a574_dd06_478f_937c_6cec20a5777c.slice/crio-b2c5fd8a262da5a5dacb1784efd8335773dfe3db255e1ded9c21466868e48ae5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d22a574_dd06_478f_937c_6cec20a5777c.slice\": RecentStats: unable to find data in memory cache]" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.146242 4546 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/keystone-db-sync-c7gjn" event={"ID":"1d22a574-dd06-478f-937c-6cec20a5777c","Type":"ContainerDied","Data":"b2c5fd8a262da5a5dacb1784efd8335773dfe3db255e1ded9c21466868e48ae5"} Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.146292 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2c5fd8a262da5a5dacb1784efd8335773dfe3db255e1ded9c21466868e48ae5" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.146367 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c7gjn" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.361157 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-646574d479-zlxzl"] Feb 01 06:57:00 crc kubenswrapper[4546]: E0201 06:57:00.361622 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d22a574-dd06-478f-937c-6cec20a5777c" containerName="keystone-db-sync" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.361640 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d22a574-dd06-478f-937c-6cec20a5777c" containerName="keystone-db-sync" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.361823 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d22a574-dd06-478f-937c-6cec20a5777c" containerName="keystone-db-sync" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.362644 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.374744 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9rq8w"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.375690 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.378687 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.380355 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.380581 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q48f2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.380731 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.380850 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.427885 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-646574d479-zlxzl"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.445156 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9rq8w"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.497918 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-6ktch"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.499199 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.504066 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-6ktch"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.504830 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-72t6p" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.505479 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.513950 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-config-data\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.513990 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-combined-ca-bundle\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514024 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-fernet-keys\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514050 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-swift-storage-0\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514067 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgb58\" (UniqueName: \"kubernetes.io/projected/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-kube-api-access-fgb58\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514094 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klsrs\" (UniqueName: \"kubernetes.io/projected/565c1975-dd8f-418f-87ea-5f836ee42c5b-kube-api-access-klsrs\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514140 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-nb\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514163 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-scripts\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514175 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-credential-keys\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514205 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-sb\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514227 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-config\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.514246 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-svc\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.591958 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-b9btc"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.594218 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.605294 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jn4mm" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.605611 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.605632 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.618333 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-sb\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.618503 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-config\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.618587 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-svc\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.618663 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-config-data\") pod \"keystone-bootstrap-9rq8w\" (UID: 
\"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.618736 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp4g7\" (UniqueName: \"kubernetes.io/projected/8b4a2956-c177-42f3-8981-830dbac77943-kube-api-access-cp4g7\") pod \"heat-db-sync-6ktch\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") " pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.618812 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-combined-ca-bundle\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.618910 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-fernet-keys\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.619116 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-swift-storage-0\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.619184 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgb58\" (UniqueName: \"kubernetes.io/projected/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-kube-api-access-fgb58\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " 
pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.619253 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klsrs\" (UniqueName: \"kubernetes.io/projected/565c1975-dd8f-418f-87ea-5f836ee42c5b-kube-api-access-klsrs\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.619327 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-config-data\") pod \"heat-db-sync-6ktch\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") " pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.619424 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-combined-ca-bundle\") pod \"heat-db-sync-6ktch\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") " pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.619505 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-nb\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.619575 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-scripts\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc 
kubenswrapper[4546]: I0201 06:57:00.619634 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-credential-keys\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.622118 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-swift-storage-0\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.623088 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-nb\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.627333 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-scripts\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.631556 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-credential-keys\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.631677 4546 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-55cb447c8f-m8jw2"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.634296 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.637251 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-config\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.637721 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.638077 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.638296 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.638774 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-fernet-keys\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.639100 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-jvjtj" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.639652 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-svc\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc 
kubenswrapper[4546]: I0201 06:57:00.645372 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-sb\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.651327 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-combined-ca-bundle\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.669443 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-config-data\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.695187 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.698674 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.702958 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klsrs\" (UniqueName: \"kubernetes.io/projected/565c1975-dd8f-418f-87ea-5f836ee42c5b-kube-api-access-klsrs\") pod \"dnsmasq-dns-646574d479-zlxzl\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.703337 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.703545 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.705003 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgb58\" (UniqueName: \"kubernetes.io/projected/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-kube-api-access-fgb58\") pod \"keystone-bootstrap-9rq8w\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.725988 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp4g7\" (UniqueName: \"kubernetes.io/projected/8b4a2956-c177-42f3-8981-830dbac77943-kube-api-access-cp4g7\") pod \"heat-db-sync-6ktch\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") " pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726042 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-config-data\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 
06:57:00.726072 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-scripts\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726092 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-db-sync-config-data\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726139 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ad13b31-fc9b-4e58-97f5-35f208029aad-horizon-secret-key\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726175 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvvqs\" (UniqueName: \"kubernetes.io/projected/59c89483-60db-4db0-8957-32962d2a73b1-kube-api-access-wvvqs\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726192 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-config-data\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726212 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-config-data\") pod \"heat-db-sync-6ktch\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") " pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726229 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59c89483-60db-4db0-8957-32962d2a73b1-etc-machine-id\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726248 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad13b31-fc9b-4e58-97f5-35f208029aad-logs\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726278 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-scripts\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726295 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-combined-ca-bundle\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726337 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-combined-ca-bundle\") pod \"heat-db-sync-6ktch\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") " pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726401 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvsmv\" (UniqueName: \"kubernetes.io/projected/3ad13b31-fc9b-4e58-97f5-35f208029aad-kube-api-access-wvsmv\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.726964 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55cb447c8f-m8jw2"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.736084 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-config-data\") pod \"heat-db-sync-6ktch\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") " pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.742326 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-combined-ca-bundle\") pod \"heat-db-sync-6ktch\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") " pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.744950 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b9btc"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.758426 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp4g7\" (UniqueName: \"kubernetes.io/projected/8b4a2956-c177-42f3-8981-830dbac77943-kube-api-access-cp4g7\") pod \"heat-db-sync-6ktch\" (UID: 
\"8b4a2956-c177-42f3-8981-830dbac77943\") " pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.760953 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.768097 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cd5px"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.769407 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.771875 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v5ndh" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.772108 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.772271 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.780913 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cd5px"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.822315 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-6ktch" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827334 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ad13b31-fc9b-4e58-97f5-35f208029aad-horizon-secret-key\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827380 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-scripts\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827399 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-config-data\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827416 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvvqs\" (UniqueName: \"kubernetes.io/projected/59c89483-60db-4db0-8957-32962d2a73b1-kube-api-access-wvvqs\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827457 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59c89483-60db-4db0-8957-32962d2a73b1-etc-machine-id\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827472 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad13b31-fc9b-4e58-97f5-35f208029aad-logs\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827509 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-scripts\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827527 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-combined-ca-bundle\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827545 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-run-httpd\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827584 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827603 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-config-data\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827631 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvsmv\" (UniqueName: \"kubernetes.io/projected/3ad13b31-fc9b-4e58-97f5-35f208029aad-kube-api-access-wvsmv\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827646 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827663 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-log-httpd\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827687 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fqct\" (UniqueName: \"kubernetes.io/projected/dd1d825a-ca7c-4a01-9f10-52876f202ef6-kube-api-access-6fqct\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827739 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-config-data\") pod \"horizon-55cb447c8f-m8jw2\" (UID: 
\"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827757 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-scripts\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.827772 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-db-sync-config-data\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.830246 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-db-sync-config-data\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.831414 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-combined-ca-bundle\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.833189 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-scripts\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.833382 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ad13b31-fc9b-4e58-97f5-35f208029aad-horizon-secret-key\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.833432 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59c89483-60db-4db0-8957-32962d2a73b1-etc-machine-id\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.833645 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad13b31-fc9b-4e58-97f5-35f208029aad-logs\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.834313 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-scripts\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.836587 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-config-data\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.838304 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-config-data\") pod 
\"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.877253 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvsmv\" (UniqueName: \"kubernetes.io/projected/3ad13b31-fc9b-4e58-97f5-35f208029aad-kube-api-access-wvsmv\") pod \"horizon-55cb447c8f-m8jw2\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.879729 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-qjczq"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.880934 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.884287 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.884480 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.892314 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xcwwh" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.900718 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvvqs\" (UniqueName: \"kubernetes.io/projected/59c89483-60db-4db0-8957-32962d2a73b1-kube-api-access-wvvqs\") pod \"cinder-db-sync-b9btc\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") " pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.919435 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qjczq"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.924768 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-646574d479-zlxzl"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.925586 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.928888 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-combined-ca-bundle\") pod \"neutron-db-sync-cd5px\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.928948 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-scripts\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.929011 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-run-httpd\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.929046 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.929065 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-config-data\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " 
pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.929088 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5v4h\" (UniqueName: \"kubernetes.io/projected/e19e5c53-445e-4852-80c6-7bce38282557-kube-api-access-d5v4h\") pod \"neutron-db-sync-cd5px\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.929114 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.929137 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-log-httpd\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.929153 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-config\") pod \"neutron-db-sync-cd5px\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.929170 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fqct\" (UniqueName: \"kubernetes.io/projected/dd1d825a-ca7c-4a01-9f10-52876f202ef6-kube-api-access-6fqct\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.930069 4546 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-run-httpd\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.934597 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-log-httpd\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.940399 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-scripts\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.942421 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.946906 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-config-data\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.947452 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc 
kubenswrapper[4546]: I0201 06:57:00.961506 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-pgw6x"] Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.962666 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.963435 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fqct\" (UniqueName: \"kubernetes.io/projected/dd1d825a-ca7c-4a01-9f10-52876f202ef6-kube-api-access-6fqct\") pod \"ceilometer-0\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " pod="openstack/ceilometer-0" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.965482 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-k2llf" Feb 01 06:57:00 crc kubenswrapper[4546]: I0201 06:57:00.965648 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.000086 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.007136 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pgw6x"] Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.027217 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f7b9bf65-z52b6"] Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.029668 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.030958 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28gf\" (UniqueName: \"kubernetes.io/projected/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-kube-api-access-f28gf\") pod \"barbican-db-sync-pgw6x\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") " pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031003 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6wvv\" (UniqueName: \"kubernetes.io/projected/7af56bb5-2257-4f2f-97c8-a33236d55b81-kube-api-access-f6wvv\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031022 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-db-sync-config-data\") pod \"barbican-db-sync-pgw6x\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") " pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031056 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5v4h\" (UniqueName: \"kubernetes.io/projected/e19e5c53-445e-4852-80c6-7bce38282557-kube-api-access-d5v4h\") pod \"neutron-db-sync-cd5px\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031076 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-combined-ca-bundle\") pod \"placement-db-sync-qjczq\" 
(UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031091 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-config-data\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031105 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-config\") pod \"neutron-db-sync-cd5px\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031131 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af56bb5-2257-4f2f-97c8-a33236d55b81-logs\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031156 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-combined-ca-bundle\") pod \"barbican-db-sync-pgw6x\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") " pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031185 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-scripts\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 
01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.031199 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-combined-ca-bundle\") pod \"neutron-db-sync-cd5px\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.040417 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-combined-ca-bundle\") pod \"neutron-db-sync-cd5px\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.043521 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-config\") pod \"neutron-db-sync-cd5px\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.059834 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7b9bf65-z52b6"] Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.077250 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5v4h\" (UniqueName: \"kubernetes.io/projected/e19e5c53-445e-4852-80c6-7bce38282557-kube-api-access-d5v4h\") pod \"neutron-db-sync-cd5px\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.079697 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-b9btc" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.091426 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fc6d5f569-qh985"] Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.092126 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.099880 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.113664 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.118511 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137417 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137453 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28gf\" (UniqueName: \"kubernetes.io/projected/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-kube-api-access-f28gf\") pod \"barbican-db-sync-pgw6x\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") " pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137475 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-config\") pod 
\"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137516 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6wvv\" (UniqueName: \"kubernetes.io/projected/7af56bb5-2257-4f2f-97c8-a33236d55b81-kube-api-access-f6wvv\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137533 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-db-sync-config-data\") pod \"barbican-db-sync-pgw6x\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") " pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137548 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-svc\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137577 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-swift-storage-0\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137597 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-combined-ca-bundle\") pod 
\"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137612 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-config-data\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137663 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af56bb5-2257-4f2f-97c8-a33236d55b81-logs\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137686 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-combined-ca-bundle\") pod \"barbican-db-sync-pgw6x\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") " pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137710 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkgfh\" (UniqueName: \"kubernetes.io/projected/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-kube-api-access-bkgfh\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137731 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-scripts\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " 
pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.137749 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-sb\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.150716 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af56bb5-2257-4f2f-97c8-a33236d55b81-logs\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.157035 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-config-data\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.157391 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-db-sync-config-data\") pod \"barbican-db-sync-pgw6x\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") " pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.158909 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-scripts\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.159495 4546 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-combined-ca-bundle\") pod \"barbican-db-sync-pgw6x\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") " pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.172540 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-combined-ca-bundle\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.173275 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6wvv\" (UniqueName: \"kubernetes.io/projected/7af56bb5-2257-4f2f-97c8-a33236d55b81-kube-api-access-f6wvv\") pod \"placement-db-sync-qjczq\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.173591 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28gf\" (UniqueName: \"kubernetes.io/projected/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-kube-api-access-f28gf\") pod \"barbican-db-sync-pgw6x\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") " pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.178546 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fc6d5f569-qh985"] Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.219707 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.244530 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-horizon-secret-key\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.246714 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-scripts\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.247251 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-svc\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.247411 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-swift-storage-0\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.247511 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-logs\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" 
Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.247631 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t2wq\" (UniqueName: \"kubernetes.io/projected/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-kube-api-access-2t2wq\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.247823 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-config-data\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.247986 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkgfh\" (UniqueName: \"kubernetes.io/projected/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-kube-api-access-bkgfh\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.250028 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-sb\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.250082 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc 
kubenswrapper[4546]: I0201 06:57:01.250152 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-config\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.250827 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-config\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.248292 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-svc\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.248563 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-swift-storage-0\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.251428 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-sb\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.251539 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.276530 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkgfh\" (UniqueName: \"kubernetes.io/projected/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-kube-api-access-bkgfh\") pod \"dnsmasq-dns-6f7b9bf65-z52b6\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.327971 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.352212 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-horizon-secret-key\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.352266 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-scripts\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.352316 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-logs\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.352350 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t2wq\" (UniqueName: \"kubernetes.io/projected/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-kube-api-access-2t2wq\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.352381 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-config-data\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.354237 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-config-data\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.354877 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-logs\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.355329 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-scripts\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.367270 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-horizon-secret-key\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.380328 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t2wq\" (UniqueName: \"kubernetes.io/projected/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-kube-api-access-2t2wq\") pod \"horizon-fc6d5f569-qh985\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.451382 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.467196 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-6ktch"] Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.515649 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:01 crc kubenswrapper[4546]: W0201 06:57:01.550426 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b4a2956_c177_42f3_8981_830dbac77943.slice/crio-0a8ef7f47c50e5545c89b58cb195bc83798374fc862209edf14722334145d7cf WatchSource:0}: Error finding container 0a8ef7f47c50e5545c89b58cb195bc83798374fc862209edf14722334145d7cf: Status 404 returned error can't find the container with id 0a8ef7f47c50e5545c89b58cb195bc83798374fc862209edf14722334145d7cf Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.803665 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9rq8w"] Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.806728 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.807214 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:57:01 crc kubenswrapper[4546]: W0201 06:57:01.810242 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d27dc5f_832f_4e8a_aea4_eed121c9e07c.slice/crio-a48eb08d6cbd6f084587ee8bf3f6b2bff35b5c611ed0c66b67b5da8959e64230 WatchSource:0}: Error finding container a48eb08d6cbd6f084587ee8bf3f6b2bff35b5c611ed0c66b67b5da8959e64230: Status 404 returned error can't find the container with id a48eb08d6cbd6f084587ee8bf3f6b2bff35b5c611ed0c66b67b5da8959e64230 Feb 01 06:57:01 crc kubenswrapper[4546]: I0201 06:57:01.878249 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-646574d479-zlxzl"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.012136 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.014903 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.111752 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55cb447c8f-m8jw2"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.134531 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b9btc"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.187055 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.279248 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6ktch" event={"ID":"8b4a2956-c177-42f3-8981-830dbac77943","Type":"ContainerStarted","Data":"0a8ef7f47c50e5545c89b58cb195bc83798374fc862209edf14722334145d7cf"} Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.289458 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9rq8w" event={"ID":"2d27dc5f-832f-4e8a-aea4-eed121c9e07c","Type":"ContainerStarted","Data":"123f4bf90781e8a1b921df34730c8727ba577ffffeb26b87a686fa6a3fd0d2f9"} Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.291379 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9rq8w" event={"ID":"2d27dc5f-832f-4e8a-aea4-eed121c9e07c","Type":"ContainerStarted","Data":"a48eb08d6cbd6f084587ee8bf3f6b2bff35b5c611ed0c66b67b5da8959e64230"} Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.299312 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b9btc" event={"ID":"59c89483-60db-4db0-8957-32962d2a73b1","Type":"ContainerStarted","Data":"cb6ea4acf4ed0b6c1543d8a08b901f0c6091063173bc75fca93c706e924b2753"} Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.304562 4546 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qjczq"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.313628 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fc6d5f569-qh985"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.326618 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-646574d479-zlxzl" event={"ID":"565c1975-dd8f-418f-87ea-5f836ee42c5b","Type":"ContainerStarted","Data":"6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d"} Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.326667 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-646574d479-zlxzl" event={"ID":"565c1975-dd8f-418f-87ea-5f836ee42c5b","Type":"ContainerStarted","Data":"3ff98d9f6a26eb864931a28e780d19faa85759ef12513867031c7a96539e50ff"} Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.331760 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55cb447c8f-m8jw2" event={"ID":"3ad13b31-fc9b-4e58-97f5-35f208029aad","Type":"ContainerStarted","Data":"3f6c8429cc258b36dfd36a3f0121da74e51f90ccb6c7fd613a62003bcf30e6a0"} Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.343954 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pgw6x"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.352893 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cd5px"] Feb 01 06:57:02 crc kubenswrapper[4546]: W0201 06:57:02.353825 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b3d5062_dd75_4ae1_b89e_010bfbd99a01.slice/crio-ab677ceaa12ab9ba24c66d67f972d299a9f64066706e32332cdf03bf2e1ef0f1 WatchSource:0}: Error finding container ab677ceaa12ab9ba24c66d67f972d299a9f64066706e32332cdf03bf2e1ef0f1: Status 404 returned error can't find the container with id 
ab677ceaa12ab9ba24c66d67f972d299a9f64066706e32332cdf03bf2e1ef0f1 Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.355961 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerStarted","Data":"e5882db122c3bf6a7b1aaede3a104dc53ba6a06ceade3323ee4c5184a60859d6"} Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.360215 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7b9bf65-z52b6"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.379643 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9rq8w" podStartSLOduration=2.379625347 podStartE2EDuration="2.379625347s" podCreationTimestamp="2026-02-01 06:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:02.326241874 +0000 UTC m=+852.977177890" watchObservedRunningTime="2026-02-01 06:57:02.379625347 +0000 UTC m=+853.030561363" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.709701 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55cb447c8f-m8jw2"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.800938 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bf8cbd6d5-wjq5d"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.802413 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.827443 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.835144 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bf8cbd6d5-wjq5d"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.888870 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nmb7h" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="registry-server" probeResult="failure" output=< Feb 01 06:57:02 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 06:57:02 crc kubenswrapper[4546]: > Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.914769 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.925341 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-nb\") pod \"565c1975-dd8f-418f-87ea-5f836ee42c5b\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.925375 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klsrs\" (UniqueName: \"kubernetes.io/projected/565c1975-dd8f-418f-87ea-5f836ee42c5b-kube-api-access-klsrs\") pod \"565c1975-dd8f-418f-87ea-5f836ee42c5b\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.925412 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-config\") pod \"565c1975-dd8f-418f-87ea-5f836ee42c5b\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.925464 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-svc\") pod \"565c1975-dd8f-418f-87ea-5f836ee42c5b\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.925605 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-sb\") pod \"565c1975-dd8f-418f-87ea-5f836ee42c5b\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.925748 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-swift-storage-0\") pod \"565c1975-dd8f-418f-87ea-5f836ee42c5b\" (UID: \"565c1975-dd8f-418f-87ea-5f836ee42c5b\") " Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.926085 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-logs\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.926180 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-config-data\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.926216 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-scripts\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: 
\"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.926411 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwgcn\" (UniqueName: \"kubernetes.io/projected/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-kube-api-access-gwgcn\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.926440 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-horizon-secret-key\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.939421 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565c1975-dd8f-418f-87ea-5f836ee42c5b-kube-api-access-klsrs" (OuterVolumeSpecName: "kube-api-access-klsrs") pod "565c1975-dd8f-418f-87ea-5f836ee42c5b" (UID: "565c1975-dd8f-418f-87ea-5f836ee42c5b"). InnerVolumeSpecName "kube-api-access-klsrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.964102 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "565c1975-dd8f-418f-87ea-5f836ee42c5b" (UID: "565c1975-dd8f-418f-87ea-5f836ee42c5b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.966401 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "565c1975-dd8f-418f-87ea-5f836ee42c5b" (UID: "565c1975-dd8f-418f-87ea-5f836ee42c5b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.966416 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-config" (OuterVolumeSpecName: "config") pod "565c1975-dd8f-418f-87ea-5f836ee42c5b" (UID: "565c1975-dd8f-418f-87ea-5f836ee42c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.981342 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "565c1975-dd8f-418f-87ea-5f836ee42c5b" (UID: "565c1975-dd8f-418f-87ea-5f836ee42c5b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:02 crc kubenswrapper[4546]: I0201 06:57:02.983288 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "565c1975-dd8f-418f-87ea-5f836ee42c5b" (UID: "565c1975-dd8f-418f-87ea-5f836ee42c5b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029182 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-logs\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029272 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-config-data\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029296 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-scripts\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029396 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwgcn\" (UniqueName: \"kubernetes.io/projected/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-kube-api-access-gwgcn\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029424 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-horizon-secret-key\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029477 4546 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029497 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029507 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029515 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klsrs\" (UniqueName: \"kubernetes.io/projected/565c1975-dd8f-418f-87ea-5f836ee42c5b-kube-api-access-klsrs\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029525 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.029533 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/565c1975-dd8f-418f-87ea-5f836ee42c5b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.030427 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-logs\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.030758 4546 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-scripts\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.031357 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-config-data\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.034665 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-horizon-secret-key\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.045690 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwgcn\" (UniqueName: \"kubernetes.io/projected/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-kube-api-access-gwgcn\") pod \"horizon-bf8cbd6d5-wjq5d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.086355 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wdfhd" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="registry-server" probeResult="failure" output=< Feb 01 06:57:03 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 06:57:03 crc kubenswrapper[4546]: > Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.144639 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.414051 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc6d5f569-qh985" event={"ID":"4b3d5062-dd75-4ae1-b89e-010bfbd99a01","Type":"ContainerStarted","Data":"ab677ceaa12ab9ba24c66d67f972d299a9f64066706e32332cdf03bf2e1ef0f1"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.415979 4546 generic.go:334] "Generic (PLEG): container finished" podID="2bf01534-1b7d-4f23-bc2c-02cb329a2036" containerID="9e86461b8024e892cac94f2fcccea6cdb576941b61c420446695ed6de77ab5c0" exitCode=0 Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.416045 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sscjj" event={"ID":"2bf01534-1b7d-4f23-bc2c-02cb329a2036","Type":"ContainerDied","Data":"9e86461b8024e892cac94f2fcccea6cdb576941b61c420446695ed6de77ab5c0"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.438007 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pgw6x" event={"ID":"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6","Type":"ContainerStarted","Data":"2f875253b9bbda7747a9df25c7280e2629681432a3a4d058e645c2e832b7563c"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.460781 4546 generic.go:334] "Generic (PLEG): container finished" podID="33faea48-0805-4a90-90f3-5eaf1bc1c7f3" containerID="21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead" exitCode=0 Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.460890 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" event={"ID":"33faea48-0805-4a90-90f3-5eaf1bc1c7f3","Type":"ContainerDied","Data":"21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.460932 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" 
event={"ID":"33faea48-0805-4a90-90f3-5eaf1bc1c7f3","Type":"ContainerStarted","Data":"0b13b7bfe33866b87605991226be53509eaf4d102b6ea5e94cc68fb4c88af17c"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.475987 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qjczq" event={"ID":"7af56bb5-2257-4f2f-97c8-a33236d55b81","Type":"ContainerStarted","Data":"8c6963306ee5a846733476c4d3ca190dbb3c02a097cc0944d89d4f6a1fc5d3d2"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.483481 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cd5px" event={"ID":"e19e5c53-445e-4852-80c6-7bce38282557","Type":"ContainerStarted","Data":"d9ce5a08c153effc0cb36d48340ca8be1974180bcec34eaa605af9177d079ebf"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.483525 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cd5px" event={"ID":"e19e5c53-445e-4852-80c6-7bce38282557","Type":"ContainerStarted","Data":"031332d050d78828b739dbafe5b11964e4a165d843a2222cfd67ba75d31e7033"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.569479 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cd5px" podStartSLOduration=3.569459685 podStartE2EDuration="3.569459685s" podCreationTimestamp="2026-02-01 06:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:03.551582965 +0000 UTC m=+854.202518981" watchObservedRunningTime="2026-02-01 06:57:03.569459685 +0000 UTC m=+854.220395700" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.581649 4546 generic.go:334] "Generic (PLEG): container finished" podID="565c1975-dd8f-418f-87ea-5f836ee42c5b" containerID="6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d" exitCode=0 Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.581995 4546 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-646574d479-zlxzl" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.582581 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-646574d479-zlxzl" event={"ID":"565c1975-dd8f-418f-87ea-5f836ee42c5b","Type":"ContainerDied","Data":"6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.582609 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-646574d479-zlxzl" event={"ID":"565c1975-dd8f-418f-87ea-5f836ee42c5b","Type":"ContainerDied","Data":"3ff98d9f6a26eb864931a28e780d19faa85759ef12513867031c7a96539e50ff"} Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.582638 4546 scope.go:117] "RemoveContainer" containerID="6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.698813 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bf8cbd6d5-wjq5d"] Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.731412 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-646574d479-zlxzl"] Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.739823 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-646574d479-zlxzl"] Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.757185 4546 scope.go:117] "RemoveContainer" containerID="6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d" Feb 01 06:57:03 crc kubenswrapper[4546]: E0201 06:57:03.758222 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d\": container with ID starting with 6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d not found: ID does not exist" 
containerID="6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d" Feb 01 06:57:03 crc kubenswrapper[4546]: I0201 06:57:03.758256 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d"} err="failed to get container status \"6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d\": rpc error: code = NotFound desc = could not find container \"6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d\": container with ID starting with 6aae9285345a879717e206623dba9c1f7bfd92981a15d7587bf25af798fef31d not found: ID does not exist" Feb 01 06:57:04 crc kubenswrapper[4546]: I0201 06:57:04.601788 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bf8cbd6d5-wjq5d" event={"ID":"3499bb03-a1f8-4eef-b0da-3e1b3deb224d","Type":"ContainerStarted","Data":"597d0910156a02e5eb678220d345fa56948bd49c473b685d21f88fbc0f779d79"} Feb 01 06:57:04 crc kubenswrapper[4546]: I0201 06:57:04.605706 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" event={"ID":"33faea48-0805-4a90-90f3-5eaf1bc1c7f3","Type":"ContainerStarted","Data":"aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315"} Feb 01 06:57:04 crc kubenswrapper[4546]: I0201 06:57:04.606240 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.166691 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-sscjj" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.181938 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" podStartSLOduration=5.181922657 podStartE2EDuration="5.181922657s" podCreationTimestamp="2026-02-01 06:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:04.630719087 +0000 UTC m=+855.281655103" watchObservedRunningTime="2026-02-01 06:57:05.181922657 +0000 UTC m=+855.832858674" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.208826 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-combined-ca-bundle\") pod \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.209508 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-config-data\") pod \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.209562 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-db-sync-config-data\") pod \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.209592 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdcnd\" (UniqueName: \"kubernetes.io/projected/2bf01534-1b7d-4f23-bc2c-02cb329a2036-kube-api-access-zdcnd\") pod 
\"2bf01534-1b7d-4f23-bc2c-02cb329a2036\" (UID: \"2bf01534-1b7d-4f23-bc2c-02cb329a2036\") " Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.227139 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2bf01534-1b7d-4f23-bc2c-02cb329a2036" (UID: "2bf01534-1b7d-4f23-bc2c-02cb329a2036"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.236642 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf01534-1b7d-4f23-bc2c-02cb329a2036-kube-api-access-zdcnd" (OuterVolumeSpecName: "kube-api-access-zdcnd") pod "2bf01534-1b7d-4f23-bc2c-02cb329a2036" (UID: "2bf01534-1b7d-4f23-bc2c-02cb329a2036"). InnerVolumeSpecName "kube-api-access-zdcnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.241577 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bf01534-1b7d-4f23-bc2c-02cb329a2036" (UID: "2bf01534-1b7d-4f23-bc2c-02cb329a2036"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.302955 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-config-data" (OuterVolumeSpecName: "config-data") pod "2bf01534-1b7d-4f23-bc2c-02cb329a2036" (UID: "2bf01534-1b7d-4f23-bc2c-02cb329a2036"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.312429 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.312454 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.312465 4546 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2bf01534-1b7d-4f23-bc2c-02cb329a2036-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.312474 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdcnd\" (UniqueName: \"kubernetes.io/projected/2bf01534-1b7d-4f23-bc2c-02cb329a2036-kube-api-access-zdcnd\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.388890 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.389177 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.465502 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.572884 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5n264" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.572948 4546 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5n264" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.642152 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sscjj" event={"ID":"2bf01534-1b7d-4f23-bc2c-02cb329a2036","Type":"ContainerDied","Data":"3c4a41b5b057593054cc3acb53df17ed18f2da8fa2875387a8a6de945dc37377"} Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.642220 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c4a41b5b057593054cc3acb53df17ed18f2da8fa2875387a8a6de945dc37377" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.642311 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sscjj" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.713207 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565c1975-dd8f-418f-87ea-5f836ee42c5b" path="/var/lib/kubelet/pods/565c1975-dd8f-418f-87ea-5f836ee42c5b/volumes" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.716600 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5n264" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.796526 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7b9bf65-z52b6"] Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.824658 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b885cfc67-gxrmd"] Feb 01 06:57:05 crc kubenswrapper[4546]: E0201 06:57:05.825087 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf01534-1b7d-4f23-bc2c-02cb329a2036" containerName="glance-db-sync" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.825101 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf01534-1b7d-4f23-bc2c-02cb329a2036" containerName="glance-db-sync" Feb 01 06:57:05 crc kubenswrapper[4546]: E0201 
06:57:05.825143 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565c1975-dd8f-418f-87ea-5f836ee42c5b" containerName="init" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.825148 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="565c1975-dd8f-418f-87ea-5f836ee42c5b" containerName="init" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.825319 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf01534-1b7d-4f23-bc2c-02cb329a2036" containerName="glance-db-sync" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.825332 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="565c1975-dd8f-418f-87ea-5f836ee42c5b" containerName="init" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.826258 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.844277 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.864035 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b885cfc67-gxrmd"] Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.935525 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-swift-storage-0\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.935647 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-config\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: 
\"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.935672 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-nb\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.935705 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-sb\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.935750 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-svc\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.935825 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psh52\" (UniqueName: \"kubernetes.io/projected/25e08b4c-97bb-43a5-b961-e2191859692d-kube-api-access-psh52\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:05 crc kubenswrapper[4546]: I0201 06:57:05.998051 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5n264" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.038266 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-psh52\" (UniqueName: \"kubernetes.io/projected/25e08b4c-97bb-43a5-b961-e2191859692d-kube-api-access-psh52\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.038437 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-swift-storage-0\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.038561 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-config\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.038586 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-nb\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.038616 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-sb\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.038664 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-svc\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.039393 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-svc\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.040212 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-config\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.040314 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-nb\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.040501 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-sb\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.041436 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-swift-storage-0\") pod 
\"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.088225 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psh52\" (UniqueName: \"kubernetes.io/projected/25e08b4c-97bb-43a5-b961-e2191859692d-kube-api-access-psh52\") pod \"dnsmasq-dns-b885cfc67-gxrmd\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") " pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.180759 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.685026 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.687434 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.692881 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.693125 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ht5t7" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.693221 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.705602 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.773284 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmvfl\" (UniqueName: 
\"kubernetes.io/projected/950ca6af-02df-47bc-94a4-fd835b800754-kube-api-access-qmvfl\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.773344 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-logs\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.773390 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.773445 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-config-data\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.773482 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-scripts\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.773629 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.773665 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.818001 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b885cfc67-gxrmd"] Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.876265 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.876356 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmvfl\" (UniqueName: \"kubernetes.io/projected/950ca6af-02df-47bc-94a4-fd835b800754-kube-api-access-qmvfl\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.876384 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-logs\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.876418 
4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.876460 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-config-data\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.876493 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-scripts\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.876569 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.876845 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.876907 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-logs\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.877192 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.881520 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-scripts\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.882250 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-config-data\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.886798 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.894450 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmvfl\" (UniqueName: \"kubernetes.io/projected/950ca6af-02df-47bc-94a4-fd835b800754-kube-api-access-qmvfl\") pod 
\"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.917576 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.966363 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.967821 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.976309 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 01 06:57:06 crc kubenswrapper[4546]: I0201 06:57:06.992719 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.039401 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.082841 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.082932 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.083034 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjdmz\" (UniqueName: \"kubernetes.io/projected/efd611b6-60b5-4a31-a9ee-9c519ee89de3-kube-api-access-rjdmz\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.083110 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.083148 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.083169 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.083245 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.184994 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjdmz\" (UniqueName: \"kubernetes.io/projected/efd611b6-60b5-4a31-a9ee-9c519ee89de3-kube-api-access-rjdmz\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.185097 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.185131 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.185155 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.185211 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.185261 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.185302 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.185542 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") device mount path \"/mnt/openstack/pv07\"" 
pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.185886 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.186336 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-logs\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.200107 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.209317 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.209704 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.220452 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjdmz\" (UniqueName: \"kubernetes.io/projected/efd611b6-60b5-4a31-a9ee-9c519ee89de3-kube-api-access-rjdmz\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.248090 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.312689 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnnqr"] Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.314060 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.689161 4546 generic.go:334] "Generic (PLEG): container finished" podID="25e08b4c-97bb-43a5-b961-e2191859692d" containerID="a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765" exitCode=0 Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.689291 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" event={"ID":"25e08b4c-97bb-43a5-b961-e2191859692d","Type":"ContainerDied","Data":"a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765"} Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.689321 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" event={"ID":"25e08b4c-97bb-43a5-b961-e2191859692d","Type":"ContainerStarted","Data":"036315f6c4ebb17d9c4f7cae063ca825256a87af396917662d3571615f2112d3"} Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.689473 4546 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" podUID="33faea48-0805-4a90-90f3-5eaf1bc1c7f3" containerName="dnsmasq-dns" containerID="cri-o://aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315" gracePeriod=10 Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.689679 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rnnqr" podUID="24868290-5ac4-46f3-a91d-2023c92666e6" containerName="registry-server" containerID="cri-o://a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1" gracePeriod=2 Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.706665 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:57:07 crc kubenswrapper[4546]: W0201 06:57:07.812167 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod950ca6af_02df_47bc_94a4_fd835b800754.slice/crio-b981deb02bc1c8cb44bc1acff8d294db7d6892354e093f40c95a90f027b9fa78 WatchSource:0}: Error finding container b981deb02bc1c8cb44bc1acff8d294db7d6892354e093f40c95a90f027b9fa78: Status 404 returned error can't find the container with id b981deb02bc1c8cb44bc1acff8d294db7d6892354e093f40c95a90f027b9fa78 Feb 01 06:57:07 crc kubenswrapper[4546]: I0201 06:57:07.966477 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.305939 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5n264"] Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.306448 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5n264" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="registry-server" 
containerID="cri-o://946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" gracePeriod=2 Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.426787 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.428467 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.538876 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkgfh\" (UniqueName: \"kubernetes.io/projected/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-kube-api-access-bkgfh\") pod \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.539136 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-catalog-content\") pod \"24868290-5ac4-46f3-a91d-2023c92666e6\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.539158 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-nb\") pod \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.539177 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-sb\") pod \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.539230 4546 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-svc\") pod \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.539256 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlhcz\" (UniqueName: \"kubernetes.io/projected/24868290-5ac4-46f3-a91d-2023c92666e6-kube-api-access-vlhcz\") pod \"24868290-5ac4-46f3-a91d-2023c92666e6\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.539275 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-config\") pod \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.539353 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-utilities\") pod \"24868290-5ac4-46f3-a91d-2023c92666e6\" (UID: \"24868290-5ac4-46f3-a91d-2023c92666e6\") " Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.539402 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-swift-storage-0\") pod \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\" (UID: \"33faea48-0805-4a90-90f3-5eaf1bc1c7f3\") " Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.540827 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-utilities" (OuterVolumeSpecName: "utilities") pod "24868290-5ac4-46f3-a91d-2023c92666e6" (UID: 
"24868290-5ac4-46f3-a91d-2023c92666e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.560687 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24868290-5ac4-46f3-a91d-2023c92666e6-kube-api-access-vlhcz" (OuterVolumeSpecName: "kube-api-access-vlhcz") pod "24868290-5ac4-46f3-a91d-2023c92666e6" (UID: "24868290-5ac4-46f3-a91d-2023c92666e6"). InnerVolumeSpecName "kube-api-access-vlhcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.586567 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24868290-5ac4-46f3-a91d-2023c92666e6" (UID: "24868290-5ac4-46f3-a91d-2023c92666e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.592558 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-kube-api-access-bkgfh" (OuterVolumeSpecName: "kube-api-access-bkgfh") pod "33faea48-0805-4a90-90f3-5eaf1bc1c7f3" (UID: "33faea48-0805-4a90-90f3-5eaf1bc1c7f3"). InnerVolumeSpecName "kube-api-access-bkgfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.621790 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33faea48-0805-4a90-90f3-5eaf1bc1c7f3" (UID: "33faea48-0805-4a90-90f3-5eaf1bc1c7f3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.622247 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33faea48-0805-4a90-90f3-5eaf1bc1c7f3" (UID: "33faea48-0805-4a90-90f3-5eaf1bc1c7f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.629319 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33faea48-0805-4a90-90f3-5eaf1bc1c7f3" (UID: "33faea48-0805-4a90-90f3-5eaf1bc1c7f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.629490 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-config" (OuterVolumeSpecName: "config") pod "33faea48-0805-4a90-90f3-5eaf1bc1c7f3" (UID: "33faea48-0805-4a90-90f3-5eaf1bc1c7f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.632786 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33faea48-0805-4a90-90f3-5eaf1bc1c7f3" (UID: "33faea48-0805-4a90-90f3-5eaf1bc1c7f3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.642063 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.642094 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkgfh\" (UniqueName: \"kubernetes.io/projected/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-kube-api-access-bkgfh\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.642108 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.642118 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.642126 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.642135 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.642144 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlhcz\" (UniqueName: \"kubernetes.io/projected/24868290-5ac4-46f3-a91d-2023c92666e6-kube-api-access-vlhcz\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:08 crc kubenswrapper[4546]: 
I0201 06:57:08.642152 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33faea48-0805-4a90-90f3-5eaf1bc1c7f3-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.642160 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24868290-5ac4-46f3-a91d-2023c92666e6-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.740839 4546 generic.go:334] "Generic (PLEG): container finished" podID="33faea48-0805-4a90-90f3-5eaf1bc1c7f3" containerID="aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315" exitCode=0 Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.740938 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.740997 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" event={"ID":"33faea48-0805-4a90-90f3-5eaf1bc1c7f3","Type":"ContainerDied","Data":"aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315"} Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.741062 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f7b9bf65-z52b6" event={"ID":"33faea48-0805-4a90-90f3-5eaf1bc1c7f3","Type":"ContainerDied","Data":"0b13b7bfe33866b87605991226be53509eaf4d102b6ea5e94cc68fb4c88af17c"} Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.741127 4546 scope.go:117] "RemoveContainer" containerID="aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.753644 4546 generic.go:334] "Generic (PLEG): container finished" podID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" exitCode=0 Feb 01 06:57:08 crc 
kubenswrapper[4546]: I0201 06:57:08.753688 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5n264" event={"ID":"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c","Type":"ContainerDied","Data":"946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426"} Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.755542 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"950ca6af-02df-47bc-94a4-fd835b800754","Type":"ContainerStarted","Data":"b981deb02bc1c8cb44bc1acff8d294db7d6892354e093f40c95a90f027b9fa78"} Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.763825 4546 generic.go:334] "Generic (PLEG): container finished" podID="2d27dc5f-832f-4e8a-aea4-eed121c9e07c" containerID="123f4bf90781e8a1b921df34730c8727ba577ffffeb26b87a686fa6a3fd0d2f9" exitCode=0 Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.763903 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9rq8w" event={"ID":"2d27dc5f-832f-4e8a-aea4-eed121c9e07c","Type":"ContainerDied","Data":"123f4bf90781e8a1b921df34730c8727ba577ffffeb26b87a686fa6a3fd0d2f9"} Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.765445 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd611b6-60b5-4a31-a9ee-9c519ee89de3","Type":"ContainerStarted","Data":"666abc9bc8e09779dae07f48eb07073189b78baeefa04d2e63d68ee7a6604b14"} Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.768586 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7b9bf65-z52b6"] Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.773424 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f7b9bf65-z52b6"] Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.784290 4546 generic.go:334] "Generic (PLEG): container finished" podID="24868290-5ac4-46f3-a91d-2023c92666e6" 
containerID="a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1" exitCode=0 Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.784353 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnnqr" event={"ID":"24868290-5ac4-46f3-a91d-2023c92666e6","Type":"ContainerDied","Data":"a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1"} Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.784381 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnnqr" event={"ID":"24868290-5ac4-46f3-a91d-2023c92666e6","Type":"ContainerDied","Data":"87f2ae2e44b732897454ce7134e537417146d212412fb7ceaa094a722e7bdc3d"} Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.784452 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnnqr" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.811297 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" event={"ID":"25e08b4c-97bb-43a5-b961-e2191859692d","Type":"ContainerStarted","Data":"4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e"} Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.812148 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.837902 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnnqr"] Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.839259 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnnqr"] Feb 01 06:57:08 crc kubenswrapper[4546]: I0201 06:57:08.854072 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" podStartSLOduration=3.854053405 
podStartE2EDuration="3.854053405s" podCreationTimestamp="2026-02-01 06:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:08.848936138 +0000 UTC m=+859.499872154" watchObservedRunningTime="2026-02-01 06:57:08.854053405 +0000 UTC m=+859.504989421" Feb 01 06:57:09 crc kubenswrapper[4546]: I0201 06:57:09.701672 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24868290-5ac4-46f3-a91d-2023c92666e6" path="/var/lib/kubelet/pods/24868290-5ac4-46f3-a91d-2023c92666e6/volumes" Feb 01 06:57:09 crc kubenswrapper[4546]: I0201 06:57:09.718151 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33faea48-0805-4a90-90f3-5eaf1bc1c7f3" path="/var/lib/kubelet/pods/33faea48-0805-4a90-90f3-5eaf1bc1c7f3/volumes" Feb 01 06:57:09 crc kubenswrapper[4546]: I0201 06:57:09.828244 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"950ca6af-02df-47bc-94a4-fd835b800754","Type":"ContainerStarted","Data":"10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1"} Feb 01 06:57:09 crc kubenswrapper[4546]: I0201 06:57:09.837298 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd611b6-60b5-4a31-a9ee-9c519ee89de3","Type":"ContainerStarted","Data":"1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012"} Feb 01 06:57:11 crc kubenswrapper[4546]: I0201 06:57:11.856677 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:57:11 crc kubenswrapper[4546]: I0201 06:57:11.922255 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:57:12 crc kubenswrapper[4546]: I0201 06:57:12.073660 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:57:12 crc kubenswrapper[4546]: I0201 06:57:12.166748 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:57:12 crc kubenswrapper[4546]: I0201 06:57:12.378903 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:57:12 crc kubenswrapper[4546]: I0201 06:57:12.445754 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:57:12 crc kubenswrapper[4546]: I0201 06:57:12.714393 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wdfhd"] Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.477401 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fc6d5f569-qh985"] Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.545877 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c8bd8cd6b-vfjlr"] Feb 01 06:57:13 crc kubenswrapper[4546]: E0201 06:57:13.546545 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24868290-5ac4-46f3-a91d-2023c92666e6" containerName="registry-server" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.546565 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="24868290-5ac4-46f3-a91d-2023c92666e6" containerName="registry-server" Feb 01 06:57:13 crc kubenswrapper[4546]: E0201 06:57:13.546578 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33faea48-0805-4a90-90f3-5eaf1bc1c7f3" containerName="dnsmasq-dns" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.546584 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="33faea48-0805-4a90-90f3-5eaf1bc1c7f3" containerName="dnsmasq-dns" Feb 01 06:57:13 crc kubenswrapper[4546]: E0201 06:57:13.546600 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="24868290-5ac4-46f3-a91d-2023c92666e6" containerName="extract-content" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.546605 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="24868290-5ac4-46f3-a91d-2023c92666e6" containerName="extract-content" Feb 01 06:57:13 crc kubenswrapper[4546]: E0201 06:57:13.546617 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33faea48-0805-4a90-90f3-5eaf1bc1c7f3" containerName="init" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.546622 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="33faea48-0805-4a90-90f3-5eaf1bc1c7f3" containerName="init" Feb 01 06:57:13 crc kubenswrapper[4546]: E0201 06:57:13.546629 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24868290-5ac4-46f3-a91d-2023c92666e6" containerName="extract-utilities" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.546634 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="24868290-5ac4-46f3-a91d-2023c92666e6" containerName="extract-utilities" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.546801 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="33faea48-0805-4a90-90f3-5eaf1bc1c7f3" containerName="dnsmasq-dns" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.546821 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="24868290-5ac4-46f3-a91d-2023c92666e6" containerName="registry-server" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.547648 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.549980 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.572599 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c8bd8cd6b-vfjlr"] Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.587100 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9j8q\" (UniqueName: \"kubernetes.io/projected/42765622-7cd6-4ad8-9917-35e6fccc928d-kube-api-access-s9j8q\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.587184 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-tls-certs\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.587248 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-secret-key\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.587293 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-config-data\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " 
pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.587366 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-combined-ca-bundle\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.587431 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42765622-7cd6-4ad8-9917-35e6fccc928d-logs\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.587529 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-scripts\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.637988 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bf8cbd6d5-wjq5d"] Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.666912 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5867f5bb44-shmxj"] Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.668442 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.689763 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-tls-certs\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.689831 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-secret-key\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.689877 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-config-data\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.690075 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-combined-ca-bundle\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.690131 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42765622-7cd6-4ad8-9917-35e6fccc928d-logs\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: 
I0201 06:57:13.690238 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-scripts\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.690265 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9j8q\" (UniqueName: \"kubernetes.io/projected/42765622-7cd6-4ad8-9917-35e6fccc928d-kube-api-access-s9j8q\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.693500 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-scripts\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.700751 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42765622-7cd6-4ad8-9917-35e6fccc928d-logs\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.701963 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-config-data\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.707622 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-secret-key\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.710465 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9j8q\" (UniqueName: \"kubernetes.io/projected/42765622-7cd6-4ad8-9917-35e6fccc928d-kube-api-access-s9j8q\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.710529 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5867f5bb44-shmxj"] Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.719471 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-tls-certs\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.722539 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-combined-ca-bundle\") pod \"horizon-7c8bd8cd6b-vfjlr\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.793605 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/856b2577-3e14-4b6a-9480-9c49b57aad40-logs\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.793808 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ljzp\" (UniqueName: \"kubernetes.io/projected/856b2577-3e14-4b6a-9480-9c49b57aad40-kube-api-access-8ljzp\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.793910 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/856b2577-3e14-4b6a-9480-9c49b57aad40-horizon-tls-certs\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.793996 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856b2577-3e14-4b6a-9480-9c49b57aad40-combined-ca-bundle\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.794182 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/856b2577-3e14-4b6a-9480-9c49b57aad40-scripts\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.794320 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/856b2577-3e14-4b6a-9480-9c49b57aad40-horizon-secret-key\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.794453 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/856b2577-3e14-4b6a-9480-9c49b57aad40-config-data\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.875252 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wdfhd" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="registry-server" containerID="cri-o://9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a" gracePeriod=2 Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.878165 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.897833 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ljzp\" (UniqueName: \"kubernetes.io/projected/856b2577-3e14-4b6a-9480-9c49b57aad40-kube-api-access-8ljzp\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.897910 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/856b2577-3e14-4b6a-9480-9c49b57aad40-horizon-tls-certs\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.897951 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856b2577-3e14-4b6a-9480-9c49b57aad40-combined-ca-bundle\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " 
pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.897988 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/856b2577-3e14-4b6a-9480-9c49b57aad40-scripts\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.898053 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/856b2577-3e14-4b6a-9480-9c49b57aad40-horizon-secret-key\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.898086 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/856b2577-3e14-4b6a-9480-9c49b57aad40-config-data\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.898130 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/856b2577-3e14-4b6a-9480-9c49b57aad40-logs\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.898955 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/856b2577-3e14-4b6a-9480-9c49b57aad40-logs\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.899910 4546 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/856b2577-3e14-4b6a-9480-9c49b57aad40-scripts\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.905692 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/856b2577-3e14-4b6a-9480-9c49b57aad40-config-data\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.907761 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856b2577-3e14-4b6a-9480-9c49b57aad40-combined-ca-bundle\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.910337 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/856b2577-3e14-4b6a-9480-9c49b57aad40-horizon-secret-key\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.915297 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/856b2577-3e14-4b6a-9480-9c49b57aad40-horizon-tls-certs\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.915930 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ljzp\" (UniqueName: 
\"kubernetes.io/projected/856b2577-3e14-4b6a-9480-9c49b57aad40-kube-api-access-8ljzp\") pod \"horizon-5867f5bb44-shmxj\" (UID: \"856b2577-3e14-4b6a-9480-9c49b57aad40\") " pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:13 crc kubenswrapper[4546]: I0201 06:57:13.987713 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:14 crc kubenswrapper[4546]: I0201 06:57:14.103890 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nmb7h"] Feb 01 06:57:14 crc kubenswrapper[4546]: I0201 06:57:14.104144 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nmb7h" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="registry-server" containerID="cri-o://d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca" gracePeriod=2 Feb 01 06:57:14 crc kubenswrapper[4546]: I0201 06:57:14.892796 4546 generic.go:334] "Generic (PLEG): container finished" podID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerID="d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca" exitCode=0 Feb 01 06:57:14 crc kubenswrapper[4546]: I0201 06:57:14.892894 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmb7h" event={"ID":"c07666c5-454b-4d29-8574-bfda5f24b39d","Type":"ContainerDied","Data":"d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca"} Feb 01 06:57:14 crc kubenswrapper[4546]: I0201 06:57:14.896663 4546 generic.go:334] "Generic (PLEG): container finished" podID="96495570-944a-41ba-88cb-e251b822c062" containerID="9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a" exitCode=0 Feb 01 06:57:14 crc kubenswrapper[4546]: I0201 06:57:14.896691 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdfhd" 
event={"ID":"96495570-944a-41ba-88cb-e251b822c062","Type":"ContainerDied","Data":"9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a"} Feb 01 06:57:15 crc kubenswrapper[4546]: E0201 06:57:15.573728 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:15 crc kubenswrapper[4546]: E0201 06:57:15.574166 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:15 crc kubenswrapper[4546]: E0201 06:57:15.575007 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:15 crc kubenswrapper[4546]: E0201 06:57:15.575039 4546 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-5n264" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="registry-server" Feb 01 06:57:15 crc 
kubenswrapper[4546]: I0201 06:57:15.911599 4546 generic.go:334] "Generic (PLEG): container finished" podID="e19e5c53-445e-4852-80c6-7bce38282557" containerID="d9ce5a08c153effc0cb36d48340ca8be1974180bcec34eaa605af9177d079ebf" exitCode=0 Feb 01 06:57:15 crc kubenswrapper[4546]: I0201 06:57:15.911659 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cd5px" event={"ID":"e19e5c53-445e-4852-80c6-7bce38282557","Type":"ContainerDied","Data":"d9ce5a08c153effc0cb36d48340ca8be1974180bcec34eaa605af9177d079ebf"} Feb 01 06:57:16 crc kubenswrapper[4546]: I0201 06:57:16.183020 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" Feb 01 06:57:16 crc kubenswrapper[4546]: I0201 06:57:16.292391 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9bb7bbd45-zmgsm"] Feb 01 06:57:16 crc kubenswrapper[4546]: I0201 06:57:16.292748 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="dnsmasq-dns" containerID="cri-o://c1c094bd713953a8c2476ed5f01fd66f263e757dbc3cd9d624f3a1677a9fc68e" gracePeriod=10 Feb 01 06:57:16 crc kubenswrapper[4546]: I0201 06:57:16.924139 4546 generic.go:334] "Generic (PLEG): container finished" podID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerID="c1c094bd713953a8c2476ed5f01fd66f263e757dbc3cd9d624f3a1677a9fc68e" exitCode=0 Feb 01 06:57:16 crc kubenswrapper[4546]: I0201 06:57:16.924345 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" event={"ID":"cf952bfa-8c8c-4601-8ea9-f8ac259a7831","Type":"ContainerDied","Data":"c1c094bd713953a8c2476ed5f01fd66f263e757dbc3cd9d624f3a1677a9fc68e"} Feb 01 06:57:16 crc kubenswrapper[4546]: I0201 06:57:16.970712 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" 
podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Feb 01 06:57:19 crc kubenswrapper[4546]: I0201 06:57:19.967581 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9rq8w" event={"ID":"2d27dc5f-832f-4e8a-aea4-eed121c9e07c","Type":"ContainerDied","Data":"a48eb08d6cbd6f084587ee8bf3f6b2bff35b5c611ed0c66b67b5da8959e64230"} Feb 01 06:57:19 crc kubenswrapper[4546]: I0201 06:57:19.968049 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a48eb08d6cbd6f084587ee8bf3f6b2bff35b5c611ed0c66b67b5da8959e64230" Feb 01 06:57:19 crc kubenswrapper[4546]: I0201 06:57:19.970636 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cd5px" event={"ID":"e19e5c53-445e-4852-80c6-7bce38282557","Type":"ContainerDied","Data":"031332d050d78828b739dbafe5b11964e4a165d843a2222cfd67ba75d31e7033"} Feb 01 06:57:19 crc kubenswrapper[4546]: I0201 06:57:19.970666 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="031332d050d78828b739dbafe5b11964e4a165d843a2222cfd67ba75d31e7033" Feb 01 06:57:19 crc kubenswrapper[4546]: I0201 06:57:19.979067 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:19 crc kubenswrapper[4546]: I0201 06:57:19.984800 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.046330 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-combined-ca-bundle\") pod \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.046579 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-config\") pod \"e19e5c53-445e-4852-80c6-7bce38282557\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.046756 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-fernet-keys\") pod \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.046797 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-scripts\") pod \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.046830 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-config-data\") pod \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.046874 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgb58\" (UniqueName: 
\"kubernetes.io/projected/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-kube-api-access-fgb58\") pod \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.046896 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5v4h\" (UniqueName: \"kubernetes.io/projected/e19e5c53-445e-4852-80c6-7bce38282557-kube-api-access-d5v4h\") pod \"e19e5c53-445e-4852-80c6-7bce38282557\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.046942 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-credential-keys\") pod \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\" (UID: \"2d27dc5f-832f-4e8a-aea4-eed121c9e07c\") " Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.046975 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-combined-ca-bundle\") pod \"e19e5c53-445e-4852-80c6-7bce38282557\" (UID: \"e19e5c53-445e-4852-80c6-7bce38282557\") " Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.054790 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2d27dc5f-832f-4e8a-aea4-eed121c9e07c" (UID: "2d27dc5f-832f-4e8a-aea4-eed121c9e07c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.055162 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2d27dc5f-832f-4e8a-aea4-eed121c9e07c" (UID: "2d27dc5f-832f-4e8a-aea4-eed121c9e07c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.060058 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-kube-api-access-fgb58" (OuterVolumeSpecName: "kube-api-access-fgb58") pod "2d27dc5f-832f-4e8a-aea4-eed121c9e07c" (UID: "2d27dc5f-832f-4e8a-aea4-eed121c9e07c"). InnerVolumeSpecName "kube-api-access-fgb58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.064981 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19e5c53-445e-4852-80c6-7bce38282557-kube-api-access-d5v4h" (OuterVolumeSpecName: "kube-api-access-d5v4h") pod "e19e5c53-445e-4852-80c6-7bce38282557" (UID: "e19e5c53-445e-4852-80c6-7bce38282557"). InnerVolumeSpecName "kube-api-access-d5v4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.066094 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-scripts" (OuterVolumeSpecName: "scripts") pod "2d27dc5f-832f-4e8a-aea4-eed121c9e07c" (UID: "2d27dc5f-832f-4e8a-aea4-eed121c9e07c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.075220 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-config" (OuterVolumeSpecName: "config") pod "e19e5c53-445e-4852-80c6-7bce38282557" (UID: "e19e5c53-445e-4852-80c6-7bce38282557"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.076997 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e19e5c53-445e-4852-80c6-7bce38282557" (UID: "e19e5c53-445e-4852-80c6-7bce38282557"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.109457 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d27dc5f-832f-4e8a-aea4-eed121c9e07c" (UID: "2d27dc5f-832f-4e8a-aea4-eed121c9e07c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.111537 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-config-data" (OuterVolumeSpecName: "config-data") pod "2d27dc5f-832f-4e8a-aea4-eed121c9e07c" (UID: "2d27dc5f-832f-4e8a-aea4-eed121c9e07c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.150198 4546 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.150231 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.150246 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.150258 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgb58\" (UniqueName: \"kubernetes.io/projected/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-kube-api-access-fgb58\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.150271 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5v4h\" (UniqueName: \"kubernetes.io/projected/e19e5c53-445e-4852-80c6-7bce38282557-kube-api-access-d5v4h\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.150282 4546 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.150293 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.150305 4546 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d27dc5f-832f-4e8a-aea4-eed121c9e07c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.150316 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e19e5c53-445e-4852-80c6-7bce38282557-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.976931 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cd5px" Feb 01 06:57:20 crc kubenswrapper[4546]: I0201 06:57:20.976952 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9rq8w" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.163054 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9rq8w"] Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.185668 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9rq8w"] Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.222999 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4wt8z"] Feb 01 06:57:21 crc kubenswrapper[4546]: E0201 06:57:21.228405 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19e5c53-445e-4852-80c6-7bce38282557" containerName="neutron-db-sync" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.228434 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19e5c53-445e-4852-80c6-7bce38282557" containerName="neutron-db-sync" Feb 01 06:57:21 crc kubenswrapper[4546]: E0201 06:57:21.228453 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d27dc5f-832f-4e8a-aea4-eed121c9e07c" containerName="keystone-bootstrap" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.228460 4546 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2d27dc5f-832f-4e8a-aea4-eed121c9e07c" containerName="keystone-bootstrap" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.228667 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19e5c53-445e-4852-80c6-7bce38282557" containerName="neutron-db-sync" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.228691 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d27dc5f-832f-4e8a-aea4-eed121c9e07c" containerName="keystone-bootstrap" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.229344 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.231379 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.231442 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.231647 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q48f2" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.231723 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.235570 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.250200 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d668c6fc7-hbl8c"] Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.252147 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.286953 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4wt8z"] Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.290133 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-998bg\" (UniqueName: \"kubernetes.io/projected/156aa66f-373e-4f1f-bcb5-4a764235a839-kube-api-access-998bg\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.290194 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-config-data\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.290218 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-scripts\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.290238 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-fernet-keys\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.290341 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-combined-ca-bundle\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.290452 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-credential-keys\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.305934 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d668c6fc7-hbl8c"] Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.393314 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-748cdb7884-m5r26"] Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395293 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-svc\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395393 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-combined-ca-bundle\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395445 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395473 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxm7s\" (UniqueName: \"kubernetes.io/projected/b364bd0d-fc72-4625-aba3-67afb7c32703-kube-api-access-jxm7s\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395500 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-nb\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395537 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-config\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395583 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-credential-keys\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395625 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-998bg\" (UniqueName: 
\"kubernetes.io/projected/156aa66f-373e-4f1f-bcb5-4a764235a839-kube-api-access-998bg\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395661 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-swift-storage-0\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395689 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-config-data\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395713 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-scripts\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.395736 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-fernet-keys\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.398464 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748cdb7884-m5r26"] Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.398633 4546 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.402080 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.403338 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.403495 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v5ndh" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.408975 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.419346 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-998bg\" (UniqueName: \"kubernetes.io/projected/156aa66f-373e-4f1f-bcb5-4a764235a839-kube-api-access-998bg\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.498135 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-svc\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.498715 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-credential-keys\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.498950 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jxm7s\" (UniqueName: \"kubernetes.io/projected/b364bd0d-fc72-4625-aba3-67afb7c32703-kube-api-access-jxm7s\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.498993 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-sb\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.499029 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-nb\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.499060 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-config\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.499104 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-config\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.499153 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-httpd-config\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.499262 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-swift-storage-0\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.499274 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-svc\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.499286 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2dct\" (UniqueName: \"kubernetes.io/projected/49b573cc-fc40-4ae5-825b-84e1723756e7-kube-api-access-j2dct\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.500423 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-nb\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.500452 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-sb\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.501191 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-config\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.501297 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-ovndb-tls-certs\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.501335 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-combined-ca-bundle\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.501628 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-config-data\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.505170 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-swift-storage-0\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.505461 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-scripts\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.507260 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-combined-ca-bundle\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.507909 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-fernet-keys\") pod \"keystone-bootstrap-4wt8z\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.521339 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxm7s\" (UniqueName: \"kubernetes.io/projected/b364bd0d-fc72-4625-aba3-67afb7c32703-kube-api-access-jxm7s\") pod \"dnsmasq-dns-7d668c6fc7-hbl8c\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.547349 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.569549 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.603904 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-config\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.603967 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-httpd-config\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.604049 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2dct\" (UniqueName: \"kubernetes.io/projected/49b573cc-fc40-4ae5-825b-84e1723756e7-kube-api-access-j2dct\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.604082 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-ovndb-tls-certs\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.604105 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-combined-ca-bundle\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.609163 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-combined-ca-bundle\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.611481 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-httpd-config\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.612128 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-ovndb-tls-certs\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.613603 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-config\") pod \"neutron-748cdb7884-m5r26\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.621549 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2dct\" (UniqueName: \"kubernetes.io/projected/49b573cc-fc40-4ae5-825b-84e1723756e7-kube-api-access-j2dct\") pod \"neutron-748cdb7884-m5r26\" (UID: 
\"49b573cc-fc40-4ae5-825b-84e1723756e7\") " pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.676108 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d27dc5f-832f-4e8a-aea4-eed121c9e07c" path="/var/lib/kubelet/pods/2d27dc5f-832f-4e8a-aea4-eed121c9e07c/volumes" Feb 01 06:57:21 crc kubenswrapper[4546]: E0201 06:57:21.806224 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca is running failed: container process not found" containerID="d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:21 crc kubenswrapper[4546]: E0201 06:57:21.806675 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca is running failed: container process not found" containerID="d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:21 crc kubenswrapper[4546]: E0201 06:57:21.807024 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca is running failed: container process not found" containerID="d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:21 crc kubenswrapper[4546]: E0201 06:57:21.807054 4546 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca is running failed: container process not found" 
probeType="Readiness" pod="openshift-marketplace/certified-operators-nmb7h" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="registry-server" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.876024 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:21 crc kubenswrapper[4546]: I0201 06:57:21.970868 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.010570 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a is running failed: container process not found" containerID="9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.010910 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a is running failed: container process not found" containerID="9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.011197 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a is running failed: container process not found" containerID="9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a" 
cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.011260 4546 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-wdfhd" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="registry-server" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.029052 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-placement-api:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.029127 4546 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-placement-api:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.029273 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-placement-api:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f6wvv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-qjczq_openstack(7af56bb5-2257-4f2f-97c8-a33236d55b81): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.030611 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-qjczq" podUID="7af56bb5-2257-4f2f-97c8-a33236d55b81" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.040852 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.041164 4546 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.041314 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f5h696h555h595h5d9h56chd9h85h564h5bbhf6h569h578h67fh85h58chd8h59h644h96h589h5c7hcfh6ch64h5d9h666h556h9bh7bh5bfh66bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2t2wq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-fc6d5f569-qh985_openstack(4b3d5062-dd75-4ae1-b89e-010bfbd99a01): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 
06:57:22.043773 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-horizon:8a0e02dd0fb8f726038072d0e3af1871\\\"\"]" pod="openstack/horizon-fc6d5f569-qh985" podUID="4b3d5062-dd75-4ae1-b89e-010bfbd99a01" Feb 01 06:57:22 crc kubenswrapper[4546]: E0201 06:57:22.995226 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-placement-api:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/placement-db-sync-qjczq" podUID="7af56bb5-2257-4f2f-97c8-a33236d55b81" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.221362 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bbbc47dc7-979jx"] Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.222710 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.226476 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.226638 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.236956 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bbbc47dc7-979jx"] Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.339933 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v66v\" (UniqueName: \"kubernetes.io/projected/64ac113d-2149-47d8-8a13-a864cdeff3ee-kube-api-access-9v66v\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.339977 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-config\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.340094 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-internal-tls-certs\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.340156 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-combined-ca-bundle\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.340271 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-public-tls-certs\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.340514 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-ovndb-tls-certs\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.340562 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-httpd-config\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.442109 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-public-tls-certs\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.442406 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-ovndb-tls-certs\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.442440 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-httpd-config\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.442494 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v66v\" (UniqueName: \"kubernetes.io/projected/64ac113d-2149-47d8-8a13-a864cdeff3ee-kube-api-access-9v66v\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.442515 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-config\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.442556 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-internal-tls-certs\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.442591 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-combined-ca-bundle\") pod 
\"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.458485 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-internal-tls-certs\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.458731 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-ovndb-tls-certs\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.458960 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-public-tls-certs\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.458988 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-combined-ca-bundle\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.459072 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-httpd-config\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 
06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.462298 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-config\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.469450 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v66v\" (UniqueName: \"kubernetes.io/projected/64ac113d-2149-47d8-8a13-a864cdeff3ee-kube-api-access-9v66v\") pod \"neutron-6bbbc47dc7-979jx\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:23 crc kubenswrapper[4546]: I0201 06:57:23.545338 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:25 crc kubenswrapper[4546]: E0201 06:57:25.575244 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:25 crc kubenswrapper[4546]: E0201 06:57:25.577921 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:25 crc kubenswrapper[4546]: E0201 06:57:25.579043 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created 
or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:25 crc kubenswrapper[4546]: E0201 06:57:25.579073 4546 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-5n264" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="registry-server" Feb 01 06:57:26 crc kubenswrapper[4546]: I0201 06:57:26.971296 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Feb 01 06:57:26 crc kubenswrapper[4546]: I0201 06:57:26.972184 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" Feb 01 06:57:31 crc kubenswrapper[4546]: E0201 06:57:31.806797 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca is running failed: container process not found" containerID="d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:31 crc kubenswrapper[4546]: E0201 06:57:31.809309 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca is running failed: container process not 
found" containerID="d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:31 crc kubenswrapper[4546]: E0201 06:57:31.809689 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca is running failed: container process not found" containerID="d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:31 crc kubenswrapper[4546]: E0201 06:57:31.809731 4546 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-nmb7h" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="registry-server" Feb 01 06:57:31 crc kubenswrapper[4546]: I0201 06:57:31.971258 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Feb 01 06:57:32 crc kubenswrapper[4546]: E0201 06:57:32.011101 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a is running failed: container process not found" containerID="9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:32 crc kubenswrapper[4546]: E0201 06:57:32.011493 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created 
or running: checking if PID of 9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a is running failed: container process not found" containerID="9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:32 crc kubenswrapper[4546]: E0201 06:57:32.011999 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a is running failed: container process not found" containerID="9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:32 crc kubenswrapper[4546]: E0201 06:57:32.012063 4546 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-wdfhd" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="registry-server" Feb 01 06:57:35 crc kubenswrapper[4546]: E0201 06:57:35.573318 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:35 crc kubenswrapper[4546]: E0201 06:57:35.574773 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" 
containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:35 crc kubenswrapper[4546]: E0201 06:57:35.575360 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:57:35 crc kubenswrapper[4546]: E0201 06:57:35.575395 4546 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-5n264" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="registry-server" Feb 01 06:57:36 crc kubenswrapper[4546]: I0201 06:57:36.977852 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.424690 4546 scope.go:117] "RemoveContainer" containerID="21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.539317 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5n264" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.544158 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.545886 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.671953 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-utilities\") pod \"96495570-944a-41ba-88cb-e251b822c062\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672150 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl8lw\" (UniqueName: \"kubernetes.io/projected/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-kube-api-access-nl8lw\") pod \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672198 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-logs\") pod \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672247 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rsqn\" (UniqueName: \"kubernetes.io/projected/96495570-944a-41ba-88cb-e251b822c062-kube-api-access-6rsqn\") pod \"96495570-944a-41ba-88cb-e251b822c062\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672277 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-utilities\") pod \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\" (UID: 
\"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672331 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-catalog-content\") pod \"96495570-944a-41ba-88cb-e251b822c062\" (UID: \"96495570-944a-41ba-88cb-e251b822c062\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672385 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-catalog-content\") pod \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\" (UID: \"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672415 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-config-data\") pod \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672451 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-horizon-secret-key\") pod \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672484 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-scripts\") pod \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.672513 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t2wq\" (UniqueName: 
\"kubernetes.io/projected/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-kube-api-access-2t2wq\") pod \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\" (UID: \"4b3d5062-dd75-4ae1-b89e-010bfbd99a01\") " Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.673849 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-utilities" (OuterVolumeSpecName: "utilities") pod "6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" (UID: "6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.675981 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-utilities" (OuterVolumeSpecName: "utilities") pod "96495570-944a-41ba-88cb-e251b822c062" (UID: "96495570-944a-41ba-88cb-e251b822c062"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.677305 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-logs" (OuterVolumeSpecName: "logs") pod "4b3d5062-dd75-4ae1-b89e-010bfbd99a01" (UID: "4b3d5062-dd75-4ae1-b89e-010bfbd99a01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.680379 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-config-data" (OuterVolumeSpecName: "config-data") pod "4b3d5062-dd75-4ae1-b89e-010bfbd99a01" (UID: "4b3d5062-dd75-4ae1-b89e-010bfbd99a01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.682827 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-scripts" (OuterVolumeSpecName: "scripts") pod "4b3d5062-dd75-4ae1-b89e-010bfbd99a01" (UID: "4b3d5062-dd75-4ae1-b89e-010bfbd99a01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.683555 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-kube-api-access-2t2wq" (OuterVolumeSpecName: "kube-api-access-2t2wq") pod "4b3d5062-dd75-4ae1-b89e-010bfbd99a01" (UID: "4b3d5062-dd75-4ae1-b89e-010bfbd99a01"). InnerVolumeSpecName "kube-api-access-2t2wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.683807 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96495570-944a-41ba-88cb-e251b822c062-kube-api-access-6rsqn" (OuterVolumeSpecName: "kube-api-access-6rsqn") pod "96495570-944a-41ba-88cb-e251b822c062" (UID: "96495570-944a-41ba-88cb-e251b822c062"). InnerVolumeSpecName "kube-api-access-6rsqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.687193 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4b3d5062-dd75-4ae1-b89e-010bfbd99a01" (UID: "4b3d5062-dd75-4ae1-b89e-010bfbd99a01"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.687829 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-kube-api-access-nl8lw" (OuterVolumeSpecName: "kube-api-access-nl8lw") pod "6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" (UID: "6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c"). InnerVolumeSpecName "kube-api-access-nl8lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.748931 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96495570-944a-41ba-88cb-e251b822c062" (UID: "96495570-944a-41ba-88cb-e251b822c062"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.763316 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" (UID: "6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776084 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776110 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776121 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776130 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776137 4546 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776145 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776153 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t2wq\" (UniqueName: \"kubernetes.io/projected/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-kube-api-access-2t2wq\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776160 4546 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96495570-944a-41ba-88cb-e251b822c062-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776170 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl8lw\" (UniqueName: \"kubernetes.io/projected/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c-kube-api-access-nl8lw\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776177 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b3d5062-dd75-4ae1-b89e-010bfbd99a01-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:37 crc kubenswrapper[4546]: I0201 06:57:37.776186 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rsqn\" (UniqueName: \"kubernetes.io/projected/96495570-944a-41ba-88cb-e251b822c062-kube-api-access-6rsqn\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:38 crc kubenswrapper[4546]: E0201 06:57:38.068551 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:38 crc kubenswrapper[4546]: E0201 06:57:38.069400 4546 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:38 crc kubenswrapper[4546]: E0201 06:57:38.069575 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cp4g7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-6ktch_openstack(8b4a2956-c177-42f3-8981-830dbac77943): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 
01 06:57:38 crc kubenswrapper[4546]: E0201 06:57:38.071083 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-6ktch" podUID="8b4a2956-c177-42f3-8981-830dbac77943" Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.171703 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fc6d5f569-qh985" event={"ID":"4b3d5062-dd75-4ae1-b89e-010bfbd99a01","Type":"ContainerDied","Data":"ab677ceaa12ab9ba24c66d67f972d299a9f64066706e32332cdf03bf2e1ef0f1"} Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.171978 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fc6d5f569-qh985" Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.177887 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5n264" event={"ID":"6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c","Type":"ContainerDied","Data":"1fda9466eeb7423cd7c1b5514e6212c6f978cc263f7f7ce5387090f1f295bf7f"} Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.178022 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5n264" Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.187133 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdfhd" Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.187548 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdfhd" event={"ID":"96495570-944a-41ba-88cb-e251b822c062","Type":"ContainerDied","Data":"86b4306a967b6114576efaf2b4e3043b38719535ad94934d2320e2cf30fb7b6b"} Feb 01 06:57:38 crc kubenswrapper[4546]: E0201 06:57:38.193002 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-heat-engine:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/heat-db-sync-6ktch" podUID="8b4a2956-c177-42f3-8981-830dbac77943" Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.276101 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wdfhd"] Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.282452 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wdfhd"] Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.315949 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fc6d5f569-qh985"] Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.324724 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-fc6d5f569-qh985"] Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.340924 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5n264"] Feb 01 06:57:38 crc kubenswrapper[4546]: I0201 06:57:38.344299 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5n264"] Feb 01 06:57:38 crc kubenswrapper[4546]: E0201 06:57:38.716931 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context 
canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-barbican-api:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:38 crc kubenswrapper[4546]: E0201 06:57:38.717248 4546 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-barbican-api:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:38 crc kubenswrapper[4546]: E0201 06:57:38.717389 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-barbican-api:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f28gf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:
nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-pgw6x_openstack(91d86af3-9b64-4ebd-ac39-e2063ea7c9b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:57:38 crc kubenswrapper[4546]: E0201 06:57:38.718926 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-pgw6x" podUID="91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" Feb 01 06:57:39 crc kubenswrapper[4546]: E0201 06:57:39.203763 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-barbican-api:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/barbican-db-sync-pgw6x" podUID="91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" Feb 01 06:57:39 crc kubenswrapper[4546]: I0201 06:57:39.665345 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3d5062-dd75-4ae1-b89e-010bfbd99a01" path="/var/lib/kubelet/pods/4b3d5062-dd75-4ae1-b89e-010bfbd99a01/volumes" Feb 01 06:57:39 crc kubenswrapper[4546]: I0201 06:57:39.665848 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" path="/var/lib/kubelet/pods/6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c/volumes" Feb 01 06:57:39 crc kubenswrapper[4546]: I0201 06:57:39.666551 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96495570-944a-41ba-88cb-e251b822c062" path="/var/lib/kubelet/pods/96495570-944a-41ba-88cb-e251b822c062/volumes" Feb 01 06:57:39 crc kubenswrapper[4546]: E0201 06:57:39.914724 
4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:39 crc kubenswrapper[4546]: E0201 06:57:39.914818 4546 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 06:57:39 crc kubenswrapper[4546]: E0201 06:57:39.915250 4546 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_
files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wvvqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-b9btc_openstack(59c89483-60db-4db0-8957-32962d2a73b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 06:57:39 crc kubenswrapper[4546]: E0201 06:57:39.917296 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-b9btc" podUID="59c89483-60db-4db0-8957-32962d2a73b1" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.019040 4546 scope.go:117] "RemoveContainer" containerID="aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315" Feb 01 06:57:40 crc kubenswrapper[4546]: E0201 06:57:40.019764 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315\": container with ID starting with aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315 not found: ID does not exist" containerID="aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.019819 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315"} err="failed to get container status \"aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315\": rpc error: code = NotFound desc = could not find container \"aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315\": container with ID starting with aaf7d2c379e8e19011af95e0c79c721b09a2658cb8ac951ba3501063ff126315 not found: ID does not exist" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.019851 4546 scope.go:117] "RemoveContainer" containerID="21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead" Feb 01 06:57:40 crc kubenswrapper[4546]: E0201 06:57:40.020159 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead\": container with ID starting with 21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead not found: ID does not exist" containerID="21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.020187 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead"} err="failed to get container status \"21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead\": rpc error: code = NotFound desc = could not find container \"21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead\": container with ID 
starting with 21667e7d3711cfc6cd96d0b88cf1c6dc660acdd3fdec1bc59607d1fc97cceead not found: ID does not exist" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.020206 4546 scope.go:117] "RemoveContainer" containerID="a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.254962 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" event={"ID":"cf952bfa-8c8c-4601-8ea9-f8ac259a7831","Type":"ContainerDied","Data":"6eb42d0917f6452dde020caa6a6b220bacf886e453e65e0e24dcf64fb6b0aeba"} Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.255803 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eb42d0917f6452dde020caa6a6b220bacf886e453e65e0e24dcf64fb6b0aeba" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.258986 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.262196 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmb7h" event={"ID":"c07666c5-454b-4d29-8574-bfda5f24b39d","Type":"ContainerDied","Data":"e845270320bcc47eae8baf5a276ed391b7aab05be92a676a95d0f6d5c208efe7"} Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.263728 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e845270320bcc47eae8baf5a276ed391b7aab05be92a676a95d0f6d5c208efe7" Feb 01 06:57:40 crc kubenswrapper[4546]: E0201 06:57:40.271325 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/cinder-db-sync-b9btc" podUID="59c89483-60db-4db0-8957-32962d2a73b1" Feb 01 06:57:40 crc 
kubenswrapper[4546]: I0201 06:57:40.273247 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.354107 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-catalog-content\") pod \"c07666c5-454b-4d29-8574-bfda5f24b39d\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.360069 4546 scope.go:117] "RemoveContainer" containerID="3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.365479 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-sb\") pod \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.366388 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-swift-storage-0\") pod \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.370156 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-svc\") pod \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.370247 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-config\") pod \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.370327 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ghw2\" (UniqueName: \"kubernetes.io/projected/c07666c5-454b-4d29-8574-bfda5f24b39d-kube-api-access-9ghw2\") pod \"c07666c5-454b-4d29-8574-bfda5f24b39d\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.370353 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-nb\") pod \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.372111 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-utilities\") pod \"c07666c5-454b-4d29-8574-bfda5f24b39d\" (UID: \"c07666c5-454b-4d29-8574-bfda5f24b39d\") " Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.372225 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6jln\" (UniqueName: \"kubernetes.io/projected/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-kube-api-access-b6jln\") pod \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\" (UID: \"cf952bfa-8c8c-4601-8ea9-f8ac259a7831\") " Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.385449 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-utilities" (OuterVolumeSpecName: "utilities") pod "c07666c5-454b-4d29-8574-bfda5f24b39d" (UID: "c07666c5-454b-4d29-8574-bfda5f24b39d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.403385 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c07666c5-454b-4d29-8574-bfda5f24b39d-kube-api-access-9ghw2" (OuterVolumeSpecName: "kube-api-access-9ghw2") pod "c07666c5-454b-4d29-8574-bfda5f24b39d" (UID: "c07666c5-454b-4d29-8574-bfda5f24b39d"). InnerVolumeSpecName "kube-api-access-9ghw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.407919 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-kube-api-access-b6jln" (OuterVolumeSpecName: "kube-api-access-b6jln") pod "cf952bfa-8c8c-4601-8ea9-f8ac259a7831" (UID: "cf952bfa-8c8c-4601-8ea9-f8ac259a7831"). InnerVolumeSpecName "kube-api-access-b6jln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.410733 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c07666c5-454b-4d29-8574-bfda5f24b39d" (UID: "c07666c5-454b-4d29-8574-bfda5f24b39d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.454406 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf952bfa-8c8c-4601-8ea9-f8ac259a7831" (UID: "cf952bfa-8c8c-4601-8ea9-f8ac259a7831"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.475826 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6jln\" (UniqueName: \"kubernetes.io/projected/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-kube-api-access-b6jln\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.475844 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.475865 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.475874 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ghw2\" (UniqueName: \"kubernetes.io/projected/c07666c5-454b-4d29-8574-bfda5f24b39d-kube-api-access-9ghw2\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.475883 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c07666c5-454b-4d29-8574-bfda5f24b39d-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.476034 4546 scope.go:117] "RemoveContainer" containerID="b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.526586 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf952bfa-8c8c-4601-8ea9-f8ac259a7831" (UID: "cf952bfa-8c8c-4601-8ea9-f8ac259a7831"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.547931 4546 scope.go:117] "RemoveContainer" containerID="a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1" Feb 01 06:57:40 crc kubenswrapper[4546]: E0201 06:57:40.551753 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1\": container with ID starting with a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1 not found: ID does not exist" containerID="a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.551788 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1"} err="failed to get container status \"a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1\": rpc error: code = NotFound desc = could not find container \"a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1\": container with ID starting with a1b4d93e9dae204340ddd98732a6729ad5462bf1b90b70c3bdbd3915e78adcd1 not found: ID does not exist" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.551814 4546 scope.go:117] "RemoveContainer" containerID="3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e" Feb 01 06:57:40 crc kubenswrapper[4546]: E0201 06:57:40.552400 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e\": container with ID starting with 3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e not found: ID does not exist" containerID="3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.552440 
4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e"} err="failed to get container status \"3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e\": rpc error: code = NotFound desc = could not find container \"3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e\": container with ID starting with 3a0cf08d7ae08433efa21f52c32a9d69113bca69a53bf3fbebfce326a5a1504e not found: ID does not exist" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.552467 4546 scope.go:117] "RemoveContainer" containerID="b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f" Feb 01 06:57:40 crc kubenswrapper[4546]: E0201 06:57:40.555743 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f\": container with ID starting with b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f not found: ID does not exist" containerID="b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.555777 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f"} err="failed to get container status \"b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f\": rpc error: code = NotFound desc = could not find container \"b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f\": container with ID starting with b74f9f4f5b3bb4363379a800f7faac82e99cfb30e72dc413bcc79b4949c1f37f not found: ID does not exist" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.555795 4546 scope.go:117] "RemoveContainer" containerID="946b09294132839098ad12456b5d336b2967e4a52f5242d4de574986f78c3426" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 
06:57:40.577893 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.579178 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf952bfa-8c8c-4601-8ea9-f8ac259a7831" (UID: "cf952bfa-8c8c-4601-8ea9-f8ac259a7831"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.585851 4546 scope.go:117] "RemoveContainer" containerID="9b0e71c07f08fb962496de3dfe84f84792b33a4ca949f820e59860e2d09e1d8f" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.605270 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-config" (OuterVolumeSpecName: "config") pod "cf952bfa-8c8c-4601-8ea9-f8ac259a7831" (UID: "cf952bfa-8c8c-4601-8ea9-f8ac259a7831"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.606435 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf952bfa-8c8c-4601-8ea9-f8ac259a7831" (UID: "cf952bfa-8c8c-4601-8ea9-f8ac259a7831"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.691592 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.691892 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.691904 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf952bfa-8c8c-4601-8ea9-f8ac259a7831-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.697286 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5867f5bb44-shmxj"] Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.702013 4546 scope.go:117] "RemoveContainer" containerID="8f09abbe04bbc350aa0c372d9fc629b50cb2b8b57ec024a81c2c914d060c42ff" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.716626 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c8bd8cd6b-vfjlr"] Feb 01 06:57:40 crc kubenswrapper[4546]: W0201 06:57:40.716728 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod856b2577_3e14_4b6a_9480_9c49b57aad40.slice/crio-7f349ee19576a4a4130ea3a91a3e862b38169b8b5989f95d0025c24add32a78e WatchSource:0}: Error finding container 7f349ee19576a4a4130ea3a91a3e862b38169b8b5989f95d0025c24add32a78e: Status 404 returned error can't find the container with id 7f349ee19576a4a4130ea3a91a3e862b38169b8b5989f95d0025c24add32a78e Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.800416 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-4wt8z"] Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.800990 4546 scope.go:117] "RemoveContainer" containerID="9b905fe0735f9a8d38d02dd6ca769e80961ee070af0918b46bc3f398a495873a" Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.886483 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748cdb7884-m5r26"] Feb 01 06:57:40 crc kubenswrapper[4546]: I0201 06:57:40.962200 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d668c6fc7-hbl8c"] Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.030448 4546 scope.go:117] "RemoveContainer" containerID="fce5c0bd3b9c03c122cf7625a7efd9e32da5c57ee952db510965ae7dac15d9cc" Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.130189 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bbbc47dc7-979jx"] Feb 01 06:57:41 crc kubenswrapper[4546]: W0201 06:57:41.134794 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64ac113d_2149_47d8_8a13_a864cdeff3ee.slice/crio-51701b808b3db4b9686dda0998722c5f4919552455165cc265ae46a1ef4b3692 WatchSource:0}: Error finding container 51701b808b3db4b9686dda0998722c5f4919552455165cc265ae46a1ef4b3692: Status 404 returned error can't find the container with id 51701b808b3db4b9686dda0998722c5f4919552455165cc265ae46a1ef4b3692 Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.262182 4546 scope.go:117] "RemoveContainer" containerID="8085beef3a5ccfc945432735f3293c03b0145982c5d29519394b0a078eff7e71" Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.308717 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbbc47dc7-979jx" event={"ID":"64ac113d-2149-47d8-8a13-a864cdeff3ee","Type":"ContainerStarted","Data":"51701b808b3db4b9686dda0998722c5f4919552455165cc265ae46a1ef4b3692"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.317085 4546 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="950ca6af-02df-47bc-94a4-fd835b800754" containerName="glance-log" containerID="cri-o://10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1" gracePeriod=30 Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.317417 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="950ca6af-02df-47bc-94a4-fd835b800754" containerName="glance-httpd" containerID="cri-o://87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c" gracePeriod=30 Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.330836 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8bd8cd6b-vfjlr" event={"ID":"42765622-7cd6-4ad8-9917-35e6fccc928d","Type":"ContainerStarted","Data":"1dc96d1f38420507484550dc5fea604ec7287d0ef1855005f7600d236c468b7c"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.330904 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8bd8cd6b-vfjlr" event={"ID":"42765622-7cd6-4ad8-9917-35e6fccc928d","Type":"ContainerStarted","Data":"f632170be0c9e01aec485bb30ecfa4234b4d9ca9d3f5f2c42f7a045df77cc580"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.345060 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5867f5bb44-shmxj" event={"ID":"856b2577-3e14-4b6a-9480-9c49b57aad40","Type":"ContainerStarted","Data":"7f349ee19576a4a4130ea3a91a3e862b38169b8b5989f95d0025c24add32a78e"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.359802 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=36.359768091 podStartE2EDuration="36.359768091s" podCreationTimestamp="2026-02-01 06:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-01 06:57:41.344437151 +0000 UTC m=+891.995373168" watchObservedRunningTime="2026-02-01 06:57:41.359768091 +0000 UTC m=+892.010704108" Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.361820 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qjczq" event={"ID":"7af56bb5-2257-4f2f-97c8-a33236d55b81","Type":"ContainerStarted","Data":"ca676bfa1fe391f87550448426c1dbc286f9722ad540f53698167426dc53b6b8"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.367940 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4wt8z" event={"ID":"156aa66f-373e-4f1f-bcb5-4a764235a839","Type":"ContainerStarted","Data":"d11ca99e425bd95ce6f1bf53a9c50bd9e556a1f9c1d390787346cabca6e3b3ef"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.369016 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748cdb7884-m5r26" event={"ID":"49b573cc-fc40-4ae5-825b-84e1723756e7","Type":"ContainerStarted","Data":"5d4f3f345f84654584edb69659d9d0057a5004251db3715ade660b1cc289aaed"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.370597 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bf8cbd6d5-wjq5d" event={"ID":"3499bb03-a1f8-4eef-b0da-3e1b3deb224d","Type":"ContainerStarted","Data":"d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.370801 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bf8cbd6d5-wjq5d" podUID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" containerName="horizon-log" containerID="cri-o://d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4" gracePeriod=30 Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.371167 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-bf8cbd6d5-wjq5d" podUID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" 
containerName="horizon" containerID="cri-o://dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1" gracePeriod=30 Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.375653 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerStarted","Data":"07590c57da60555fe686858a2df6c9fc569ea928439e69fe7aecfb572f0003eb"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.376648 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-qjczq" podStartSLOduration=3.775401675 podStartE2EDuration="41.376637812s" podCreationTimestamp="2026-02-01 06:57:00 +0000 UTC" firstStartedPulling="2026-02-01 06:57:02.377793062 +0000 UTC m=+853.028729079" lastFinishedPulling="2026-02-01 06:57:39.9790292 +0000 UTC m=+890.629965216" observedRunningTime="2026-02-01 06:57:41.373084033 +0000 UTC m=+892.024020039" watchObservedRunningTime="2026-02-01 06:57:41.376637812 +0000 UTC m=+892.027573818" Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.391641 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" event={"ID":"b364bd0d-fc72-4625-aba3-67afb7c32703","Type":"ContainerStarted","Data":"bb8d8edaf2bc30360d63def4a52318e90c684a147c38ca35e5865b8e619fe381"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.393085 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nmb7h" Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.399677 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55cb447c8f-m8jw2" event={"ID":"3ad13b31-fc9b-4e58-97f5-35f208029aad","Type":"ContainerStarted","Data":"ffcaf1e1ba2cc34592de70c4e17dd95cec56248aaa3690ae830c9e46caaf5127"} Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.401070 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9bb7bbd45-zmgsm" Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.445479 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-bf8cbd6d5-wjq5d" podStartSLOduration=3.2755240629999998 podStartE2EDuration="39.445469506s" podCreationTimestamp="2026-02-01 06:57:02 +0000 UTC" firstStartedPulling="2026-02-01 06:57:03.775702352 +0000 UTC m=+854.426638367" lastFinishedPulling="2026-02-01 06:57:39.945647794 +0000 UTC m=+890.596583810" observedRunningTime="2026-02-01 06:57:41.4051551 +0000 UTC m=+892.056091107" watchObservedRunningTime="2026-02-01 06:57:41.445469506 +0000 UTC m=+892.096405522" Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.446709 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nmb7h"] Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.456598 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nmb7h"] Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.465247 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9bb7bbd45-zmgsm"] Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.473749 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9bb7bbd45-zmgsm"] Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.668874 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" path="/var/lib/kubelet/pods/c07666c5-454b-4d29-8574-bfda5f24b39d/volumes" Feb 01 06:57:41 crc kubenswrapper[4546]: I0201 06:57:41.671532 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" path="/var/lib/kubelet/pods/cf952bfa-8c8c-4601-8ea9-f8ac259a7831/volumes" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.270737 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.353585 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-scripts\") pod \"950ca6af-02df-47bc-94a4-fd835b800754\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.353644 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmvfl\" (UniqueName: \"kubernetes.io/projected/950ca6af-02df-47bc-94a4-fd835b800754-kube-api-access-qmvfl\") pod \"950ca6af-02df-47bc-94a4-fd835b800754\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.353733 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-config-data\") pod \"950ca6af-02df-47bc-94a4-fd835b800754\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.353755 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-logs\") pod \"950ca6af-02df-47bc-94a4-fd835b800754\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.353833 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"950ca6af-02df-47bc-94a4-fd835b800754\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.353909 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-httpd-run\") pod \"950ca6af-02df-47bc-94a4-fd835b800754\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.353948 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-combined-ca-bundle\") pod \"950ca6af-02df-47bc-94a4-fd835b800754\" (UID: \"950ca6af-02df-47bc-94a4-fd835b800754\") " Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.359447 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-logs" (OuterVolumeSpecName: "logs") pod "950ca6af-02df-47bc-94a4-fd835b800754" (UID: "950ca6af-02df-47bc-94a4-fd835b800754"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.360040 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "950ca6af-02df-47bc-94a4-fd835b800754" (UID: "950ca6af-02df-47bc-94a4-fd835b800754"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.370193 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "950ca6af-02df-47bc-94a4-fd835b800754" (UID: "950ca6af-02df-47bc-94a4-fd835b800754"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.384070 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-scripts" (OuterVolumeSpecName: "scripts") pod "950ca6af-02df-47bc-94a4-fd835b800754" (UID: "950ca6af-02df-47bc-94a4-fd835b800754"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.384183 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/950ca6af-02df-47bc-94a4-fd835b800754-kube-api-access-qmvfl" (OuterVolumeSpecName: "kube-api-access-qmvfl") pod "950ca6af-02df-47bc-94a4-fd835b800754" (UID: "950ca6af-02df-47bc-94a4-fd835b800754"). InnerVolumeSpecName "kube-api-access-qmvfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.401251 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "950ca6af-02df-47bc-94a4-fd835b800754" (UID: "950ca6af-02df-47bc-94a4-fd835b800754"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.429254 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd611b6-60b5-4a31-a9ee-9c519ee89de3","Type":"ContainerStarted","Data":"b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.429461 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerName="glance-log" containerID="cri-o://1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012" gracePeriod=30 Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.430083 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerName="glance-httpd" containerID="cri-o://b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea" gracePeriod=30 Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.443360 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748cdb7884-m5r26" event={"ID":"49b573cc-fc40-4ae5-825b-84e1723756e7","Type":"ContainerStarted","Data":"bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.443416 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748cdb7884-m5r26" event={"ID":"49b573cc-fc40-4ae5-825b-84e1723756e7","Type":"ContainerStarted","Data":"8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.443656 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.450362 4546 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/horizon-55cb447c8f-m8jw2" event={"ID":"3ad13b31-fc9b-4e58-97f5-35f208029aad","Type":"ContainerStarted","Data":"b214ed0d0bf3c06ce016b363c648a64db0bbe04f0f3440156b319469962cf096"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.450488 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55cb447c8f-m8jw2" podUID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerName="horizon-log" containerID="cri-o://ffcaf1e1ba2cc34592de70c4e17dd95cec56248aaa3690ae830c9e46caaf5127" gracePeriod=30 Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.450769 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-55cb447c8f-m8jw2" podUID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerName="horizon" containerID="cri-o://b214ed0d0bf3c06ce016b363c648a64db0bbe04f0f3440156b319469962cf096" gracePeriod=30 Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.454942 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-config-data" (OuterVolumeSpecName: "config-data") pod "950ca6af-02df-47bc-94a4-fd835b800754" (UID: "950ca6af-02df-47bc-94a4-fd835b800754"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.456091 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.459096 4546 generic.go:334] "Generic (PLEG): container finished" podID="950ca6af-02df-47bc-94a4-fd835b800754" containerID="87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c" exitCode=143 Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.459122 4546 generic.go:334] "Generic (PLEG): container finished" podID="950ca6af-02df-47bc-94a4-fd835b800754" containerID="10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1" exitCode=143 Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.459193 4546 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.459247 4546 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/950ca6af-02df-47bc-94a4-fd835b800754-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.459304 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.459358 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.459412 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmvfl\" 
(UniqueName: \"kubernetes.io/projected/950ca6af-02df-47bc-94a4-fd835b800754-kube-api-access-qmvfl\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.456871 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bf8cbd6d5-wjq5d" event={"ID":"3499bb03-a1f8-4eef-b0da-3e1b3deb224d","Type":"ContainerStarted","Data":"dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.459523 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"950ca6af-02df-47bc-94a4-fd835b800754","Type":"ContainerDied","Data":"87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.461166 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"950ca6af-02df-47bc-94a4-fd835b800754","Type":"ContainerDied","Data":"10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.461291 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"950ca6af-02df-47bc-94a4-fd835b800754","Type":"ContainerDied","Data":"b981deb02bc1c8cb44bc1acff8d294db7d6892354e093f40c95a90f027b9fa78"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.461120 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=37.461110364 podStartE2EDuration="37.461110364s" podCreationTimestamp="2026-02-01 06:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:42.454476639 +0000 UTC m=+893.105412655" watchObservedRunningTime="2026-02-01 06:57:42.461110364 +0000 UTC m=+893.112046380" Feb 01 06:57:42 crc kubenswrapper[4546]: 
I0201 06:57:42.459200 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.463023 4546 scope.go:117] "RemoveContainer" containerID="87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.471977 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8bd8cd6b-vfjlr" event={"ID":"42765622-7cd6-4ad8-9917-35e6fccc928d","Type":"ContainerStarted","Data":"7d86ac28320dfdeffcd7f6de1c9aec106f75400f1752f6450b264050c4e7d9ce"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.476238 4546 generic.go:334] "Generic (PLEG): container finished" podID="b364bd0d-fc72-4625-aba3-67afb7c32703" containerID="b17c73d0b08382beb80f681a74a66406b90fb7830c92248557b2cf2134336c4d" exitCode=0 Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.476296 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" event={"ID":"b364bd0d-fc72-4625-aba3-67afb7c32703","Type":"ContainerDied","Data":"b17c73d0b08382beb80f681a74a66406b90fb7830c92248557b2cf2134336c4d"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.489722 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbbc47dc7-979jx" event={"ID":"64ac113d-2149-47d8-8a13-a864cdeff3ee","Type":"ContainerStarted","Data":"1dac26e923ee4b658c5bf75a6ffa320740158613f7cf7c7525307a3083e6a354"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.489811 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbbc47dc7-979jx" event={"ID":"64ac113d-2149-47d8-8a13-a864cdeff3ee","Type":"ContainerStarted","Data":"34f1d49f70e8071b576af6dd3ea1f0d6c2447d06e3d60ee0507b0b60fad1400d"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.490427 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.491398 4546 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.498202 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4wt8z" event={"ID":"156aa66f-373e-4f1f-bcb5-4a764235a839","Type":"ContainerStarted","Data":"d26c1a3c7b7135a987f7d5a19835eccee9bed2582a192dbe74791bf6131eec26"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.504880 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-748cdb7884-m5r26" podStartSLOduration=21.504850899 podStartE2EDuration="21.504850899s" podCreationTimestamp="2026-02-01 06:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:42.500578816 +0000 UTC m=+893.151514832" watchObservedRunningTime="2026-02-01 06:57:42.504850899 +0000 UTC m=+893.155786915" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.505671 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-55cb447c8f-m8jw2" podStartSLOduration=6.58699021 podStartE2EDuration="42.505665955s" podCreationTimestamp="2026-02-01 06:57:00 +0000 UTC" firstStartedPulling="2026-02-01 06:57:02.138371892 +0000 UTC m=+852.789307908" lastFinishedPulling="2026-02-01 06:57:38.057047637 +0000 UTC m=+888.707983653" observedRunningTime="2026-02-01 06:57:42.482256847 +0000 UTC m=+893.133192863" watchObservedRunningTime="2026-02-01 06:57:42.505665955 +0000 UTC m=+893.156601962" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.506442 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5867f5bb44-shmxj" 
event={"ID":"856b2577-3e14-4b6a-9480-9c49b57aad40","Type":"ContainerStarted","Data":"f841b9fb5be44670a8d71f2ff954aac574a3cb46db8d25ec8d4925d33532cfaf"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.506476 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5867f5bb44-shmxj" event={"ID":"856b2577-3e14-4b6a-9480-9c49b57aad40","Type":"ContainerStarted","Data":"eb4e4de6743bee9bce4070e027c8eed7f6a140e89af6af0494e39e4b133e1bb6"} Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.513108 4546 scope.go:117] "RemoveContainer" containerID="10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.561097 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/950ca6af-02df-47bc-94a4-fd835b800754-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.563391 4546 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.568414 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c8bd8cd6b-vfjlr" podStartSLOduration=29.568404854 podStartE2EDuration="29.568404854s" podCreationTimestamp="2026-02-01 06:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:42.542931634 +0000 UTC m=+893.193867650" watchObservedRunningTime="2026-02-01 06:57:42.568404854 +0000 UTC m=+893.219340860" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.572467 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bbbc47dc7-979jx" podStartSLOduration=19.572459077 podStartE2EDuration="19.572459077s" podCreationTimestamp="2026-02-01 06:57:23 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:42.571911745 +0000 UTC m=+893.222847761" watchObservedRunningTime="2026-02-01 06:57:42.572459077 +0000 UTC m=+893.223395092" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.577528 4546 scope.go:117] "RemoveContainer" containerID="87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.585825 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c\": container with ID starting with 87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c not found: ID does not exist" containerID="87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.585951 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c"} err="failed to get container status \"87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c\": rpc error: code = NotFound desc = could not find container \"87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c\": container with ID starting with 87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c not found: ID does not exist" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.586040 4546 scope.go:117] "RemoveContainer" containerID="10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.590250 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1\": container with ID starting with 
10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1 not found: ID does not exist" containerID="10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.590376 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1"} err="failed to get container status \"10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1\": rpc error: code = NotFound desc = could not find container \"10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1\": container with ID starting with 10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1 not found: ID does not exist" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.590441 4546 scope.go:117] "RemoveContainer" containerID="87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.595036 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c"} err="failed to get container status \"87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c\": rpc error: code = NotFound desc = could not find container \"87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c\": container with ID starting with 87b1150687140498d804bcd7e0445f45630a0a1fef6e1889a0fd077d51b4187c not found: ID does not exist" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.595134 4546 scope.go:117] "RemoveContainer" containerID="10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.600881 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1"} err="failed to get container status 
\"10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1\": rpc error: code = NotFound desc = could not find container \"10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1\": container with ID starting with 10993cbfdde2fc9d2ccad2cb4a7b022fbcf5be5eedb3bf536a0d3b1b57512fb1 not found: ID does not exist" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.624448 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.633888 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.639076 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4wt8z" podStartSLOduration=21.639063151 podStartE2EDuration="21.639063151s" podCreationTimestamp="2026-02-01 06:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:42.613843219 +0000 UTC m=+893.264779236" watchObservedRunningTime="2026-02-01 06:57:42.639063151 +0000 UTC m=+893.289999167" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.652558 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.652953 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="init" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.653012 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="init" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653060 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="extract-content" Feb 01 06:57:42 crc 
kubenswrapper[4546]: I0201 06:57:42.653113 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="extract-content" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653159 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="registry-server" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.653199 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="registry-server" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653247 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="dnsmasq-dns" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.653287 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="dnsmasq-dns" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653332 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="extract-utilities" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.653375 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="extract-utilities" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653425 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="extract-content" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.653464 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="extract-content" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653511 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="registry-server" Feb 01 06:57:42 crc 
kubenswrapper[4546]: I0201 06:57:42.653559 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="registry-server" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653609 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950ca6af-02df-47bc-94a4-fd835b800754" containerName="glance-httpd" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.653663 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="950ca6af-02df-47bc-94a4-fd835b800754" containerName="glance-httpd" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653706 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="registry-server" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.653744 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="registry-server" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653785 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="extract-utilities" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.653828 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="extract-utilities" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.653884 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="extract-content" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.653959 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="extract-content" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.654011 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950ca6af-02df-47bc-94a4-fd835b800754" containerName="glance-log" Feb 01 06:57:42 crc kubenswrapper[4546]: 
I0201 06:57:42.654052 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="950ca6af-02df-47bc-94a4-fd835b800754" containerName="glance-log" Feb 01 06:57:42 crc kubenswrapper[4546]: E0201 06:57:42.654094 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="extract-utilities" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.654141 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="extract-utilities" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.654368 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="96495570-944a-41ba-88cb-e251b822c062" containerName="registry-server" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.654429 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="c07666c5-454b-4d29-8574-bfda5f24b39d" containerName="registry-server" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.654479 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="950ca6af-02df-47bc-94a4-fd835b800754" containerName="glance-httpd" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.654529 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf952bfa-8c8c-4601-8ea9-f8ac259a7831" containerName="dnsmasq-dns" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.654584 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="950ca6af-02df-47bc-94a4-fd835b800754" containerName="glance-log" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.654633 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a778d80-9088-4ea7-82fc-8c2ff4e0ba9c" containerName="registry-server" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.656255 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.661773 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5867f5bb44-shmxj" podStartSLOduration=29.661764465 podStartE2EDuration="29.661764465s" podCreationTimestamp="2026-02-01 06:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:42.633617887 +0000 UTC m=+893.284553892" watchObservedRunningTime="2026-02-01 06:57:42.661764465 +0000 UTC m=+893.312700481" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.663091 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.663310 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.680175 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.780319 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.783184 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc 
kubenswrapper[4546]: I0201 06:57:42.785034 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.785160 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-scripts\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.785261 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-config-data\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.785817 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.785983 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54d5j\" (UniqueName: \"kubernetes.io/projected/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-kube-api-access-54d5j\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" 
Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.786107 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-logs\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.887844 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.887909 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.887946 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.887965 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-scripts\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.887981 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-config-data\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.888027 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.888057 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54d5j\" (UniqueName: \"kubernetes.io/projected/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-kube-api-access-54d5j\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.888088 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-logs\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.888468 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-logs\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.890019 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.890553 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.900314 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-config-data\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.900889 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.901490 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.916273 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.923884 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:42 crc kubenswrapper[4546]: I0201 06:57:42.932595 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54d5j\" (UniqueName: \"kubernetes.io/projected/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-kube-api-access-54d5j\") pod \"glance-default-external-api-0\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " pod="openstack/glance-default-external-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.088207 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.145763 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.324994 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.401487 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.401539 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-scripts\") pod \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.401666 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-logs\") pod \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.401683 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-config-data\") pod \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.401734 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjdmz\" (UniqueName: \"kubernetes.io/projected/efd611b6-60b5-4a31-a9ee-9c519ee89de3-kube-api-access-rjdmz\") pod \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.401752 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-httpd-run\") pod \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.401772 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-combined-ca-bundle\") pod \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\" (UID: \"efd611b6-60b5-4a31-a9ee-9c519ee89de3\") " Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.403065 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-logs" (OuterVolumeSpecName: "logs") pod "efd611b6-60b5-4a31-a9ee-9c519ee89de3" (UID: "efd611b6-60b5-4a31-a9ee-9c519ee89de3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.404165 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "efd611b6-60b5-4a31-a9ee-9c519ee89de3" (UID: "efd611b6-60b5-4a31-a9ee-9c519ee89de3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.412122 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd611b6-60b5-4a31-a9ee-9c519ee89de3-kube-api-access-rjdmz" (OuterVolumeSpecName: "kube-api-access-rjdmz") pod "efd611b6-60b5-4a31-a9ee-9c519ee89de3" (UID: "efd611b6-60b5-4a31-a9ee-9c519ee89de3"). InnerVolumeSpecName "kube-api-access-rjdmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.435190 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "efd611b6-60b5-4a31-a9ee-9c519ee89de3" (UID: "efd611b6-60b5-4a31-a9ee-9c519ee89de3"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.435296 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-scripts" (OuterVolumeSpecName: "scripts") pod "efd611b6-60b5-4a31-a9ee-9c519ee89de3" (UID: "efd611b6-60b5-4a31-a9ee-9c519ee89de3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.454463 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efd611b6-60b5-4a31-a9ee-9c519ee89de3" (UID: "efd611b6-60b5-4a31-a9ee-9c519ee89de3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.485051 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-config-data" (OuterVolumeSpecName: "config-data") pod "efd611b6-60b5-4a31-a9ee-9c519ee89de3" (UID: "efd611b6-60b5-4a31-a9ee-9c519ee89de3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.503149 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.503207 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.503219 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.503230 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjdmz\" (UniqueName: \"kubernetes.io/projected/efd611b6-60b5-4a31-a9ee-9c519ee89de3-kube-api-access-rjdmz\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.503243 4546 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efd611b6-60b5-4a31-a9ee-9c519ee89de3-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.503269 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd611b6-60b5-4a31-a9ee-9c519ee89de3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.503301 4546 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.536698 4546 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.548827 4546 generic.go:334] "Generic (PLEG): container finished" podID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerID="b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea" exitCode=0 Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.548873 4546 generic.go:334] "Generic (PLEG): container finished" podID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerID="1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012" exitCode=143 Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.548916 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd611b6-60b5-4a31-a9ee-9c519ee89de3","Type":"ContainerDied","Data":"b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea"} Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.548945 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd611b6-60b5-4a31-a9ee-9c519ee89de3","Type":"ContainerDied","Data":"1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012"} Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.548956 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"efd611b6-60b5-4a31-a9ee-9c519ee89de3","Type":"ContainerDied","Data":"666abc9bc8e09779dae07f48eb07073189b78baeefa04d2e63d68ee7a6604b14"} Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.548971 4546 scope.go:117] "RemoveContainer" containerID="b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.549073 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.554777 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" event={"ID":"b364bd0d-fc72-4625-aba3-67afb7c32703","Type":"ContainerStarted","Data":"00c11f97b4794948dfcc9be71ace0f59d2478f36bb3b9a3006ad675564daddac"} Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.555475 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.594930 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" podStartSLOduration=22.594907356 podStartE2EDuration="22.594907356s" podCreationTimestamp="2026-02-01 06:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:43.582359913 +0000 UTC m=+894.233295919" watchObservedRunningTime="2026-02-01 06:57:43.594907356 +0000 UTC m=+894.245843393" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.604909 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.605868 4546 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.610367 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.648345 4546 scope.go:117] "RemoveContainer" containerID="1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.698327 4546 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="950ca6af-02df-47bc-94a4-fd835b800754" path="/var/lib/kubelet/pods/950ca6af-02df-47bc-94a4-fd835b800754/volumes" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.699097 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" path="/var/lib/kubelet/pods/efd611b6-60b5-4a31-a9ee-9c519ee89de3/volumes" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.699616 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:57:43 crc kubenswrapper[4546]: E0201 06:57:43.699968 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerName="glance-log" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.699982 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerName="glance-log" Feb 01 06:57:43 crc kubenswrapper[4546]: E0201 06:57:43.700001 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerName="glance-httpd" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.700007 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerName="glance-httpd" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.700187 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerName="glance-httpd" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.700204 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd611b6-60b5-4a31-a9ee-9c519ee89de3" containerName="glance-log" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.702276 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.702357 4546 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.704758 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.704959 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.728302 4546 scope.go:117] "RemoveContainer" containerID="b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea" Feb 01 06:57:43 crc kubenswrapper[4546]: E0201 06:57:43.730727 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea\": container with ID starting with b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea not found: ID does not exist" containerID="b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.730759 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea"} err="failed to get container status \"b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea\": rpc error: code = NotFound desc = could not find container \"b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea\": container with ID starting with b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea not found: ID does not exist" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.730782 4546 scope.go:117] "RemoveContainer" containerID="1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012" Feb 01 06:57:43 crc kubenswrapper[4546]: E0201 06:57:43.731991 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012\": container with ID starting with 1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012 not found: ID does not exist" containerID="1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.732011 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012"} err="failed to get container status \"1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012\": rpc error: code = NotFound desc = could not find container \"1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012\": container with ID starting with 1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012 not found: ID does not exist" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.732024 4546 scope.go:117] "RemoveContainer" containerID="b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.736377 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea"} err="failed to get container status \"b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea\": rpc error: code = NotFound desc = could not find container \"b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea\": container with ID starting with b0cd8ea246f9ea20e2658da4796f172a624bafcdaa6c51c6851fd91b67ba1eea not found: ID does not exist" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.736402 4546 scope.go:117] "RemoveContainer" containerID="1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.736743 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012"} err="failed to get container status \"1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012\": rpc error: code = NotFound desc = could not find container \"1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012\": container with ID starting with 1fec4881994717aa0109106384d7a94aaca7a5460883ee998814adb4f45ae012 not found: ID does not exist" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.808345 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.808425 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.808475 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.808512 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " 
pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.808531 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5852\" (UniqueName: \"kubernetes.io/projected/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-kube-api-access-f5852\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.808560 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.808578 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.808918 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.849902 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:57:43 crc kubenswrapper[4546]: W0201 06:57:43.868001 4546 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8467d399_3ecc_4cb7_83ab_d285f8cdf7de.slice/crio-5e8a0e938938b72764834474a3c17268e5f6b604915fac9dd0578d698071ff70 WatchSource:0}: Error finding container 5e8a0e938938b72764834474a3c17268e5f6b604915fac9dd0578d698071ff70: Status 404 returned error can't find the container with id 5e8a0e938938b72764834474a3c17268e5f6b604915fac9dd0578d698071ff70 Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.879193 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.879241 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.910132 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.910192 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.910235 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.910256 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.910281 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.910298 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5852\" (UniqueName: \"kubernetes.io/projected/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-kube-api-access-f5852\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.910313 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.910328 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.910684 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.911494 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.913095 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.918020 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.919526 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.920526 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-config-data\") 
pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.921367 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.939368 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5852\" (UniqueName: \"kubernetes.io/projected/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-kube-api-access-f5852\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.988103 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.988172 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:57:43 crc kubenswrapper[4546]: I0201 06:57:43.993089 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " pod="openstack/glance-default-internal-api-0" Feb 01 06:57:44 crc kubenswrapper[4546]: I0201 06:57:44.027746 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 06:57:44 crc kubenswrapper[4546]: I0201 06:57:44.569301 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8467d399-3ecc-4cb7-83ab-d285f8cdf7de","Type":"ContainerStarted","Data":"5e8a0e938938b72764834474a3c17268e5f6b604915fac9dd0578d698071ff70"} Feb 01 06:57:44 crc kubenswrapper[4546]: I0201 06:57:44.677086 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:57:44 crc kubenswrapper[4546]: W0201 06:57:44.690287 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff7f7e42_647e_4e25_a3a8_32c23eeb9277.slice/crio-46156c13f831e3b09731f5fa785c1b941464337a17d4ac2efff54d4466769ee8 WatchSource:0}: Error finding container 46156c13f831e3b09731f5fa785c1b941464337a17d4ac2efff54d4466769ee8: Status 404 returned error can't find the container with id 46156c13f831e3b09731f5fa785c1b941464337a17d4ac2efff54d4466769ee8 Feb 01 06:57:45 crc kubenswrapper[4546]: I0201 06:57:45.579732 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff7f7e42-647e-4e25-a3a8-32c23eeb9277","Type":"ContainerStarted","Data":"46156c13f831e3b09731f5fa785c1b941464337a17d4ac2efff54d4466769ee8"} Feb 01 06:57:45 crc kubenswrapper[4546]: I0201 06:57:45.582599 4546 generic.go:334] "Generic (PLEG): container finished" podID="7af56bb5-2257-4f2f-97c8-a33236d55b81" containerID="ca676bfa1fe391f87550448426c1dbc286f9722ad540f53698167426dc53b6b8" exitCode=0 Feb 01 06:57:45 crc kubenswrapper[4546]: I0201 06:57:45.582656 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qjczq" event={"ID":"7af56bb5-2257-4f2f-97c8-a33236d55b81","Type":"ContainerDied","Data":"ca676bfa1fe391f87550448426c1dbc286f9722ad540f53698167426dc53b6b8"} Feb 01 06:57:45 crc 
kubenswrapper[4546]: I0201 06:57:45.584218 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8467d399-3ecc-4cb7-83ab-d285f8cdf7de","Type":"ContainerStarted","Data":"1d756fb963091c578fde353b62a409395523aa03978879c60b95208a39b88ca7"} Feb 01 06:57:46 crc kubenswrapper[4546]: I0201 06:57:46.593441 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8467d399-3ecc-4cb7-83ab-d285f8cdf7de","Type":"ContainerStarted","Data":"0f49c519ff4bdf72daa941545c8c3591d9ee81d4959f9b934c257411417daebc"} Feb 01 06:57:46 crc kubenswrapper[4546]: I0201 06:57:46.596967 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerStarted","Data":"4a5f15bd1d7835c016f46c2f196f9d2d2ae66c2104844c833cfa3d78a502e4a4"} Feb 01 06:57:46 crc kubenswrapper[4546]: I0201 06:57:46.598621 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff7f7e42-647e-4e25-a3a8-32c23eeb9277","Type":"ContainerStarted","Data":"3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d"} Feb 01 06:57:46 crc kubenswrapper[4546]: I0201 06:57:46.598650 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff7f7e42-647e-4e25-a3a8-32c23eeb9277","Type":"ContainerStarted","Data":"4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38"} Feb 01 06:57:46 crc kubenswrapper[4546]: I0201 06:57:46.624239 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.624220162 podStartE2EDuration="4.624220162s" podCreationTimestamp="2026-02-01 06:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:46.615512196 
+0000 UTC m=+897.266448212" watchObservedRunningTime="2026-02-01 06:57:46.624220162 +0000 UTC m=+897.275156178" Feb 01 06:57:46 crc kubenswrapper[4546]: I0201 06:57:46.652247 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.65222852 podStartE2EDuration="3.65222852s" podCreationTimestamp="2026-02-01 06:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:46.639669385 +0000 UTC m=+897.290605400" watchObservedRunningTime="2026-02-01 06:57:46.65222852 +0000 UTC m=+897.303164536" Feb 01 06:57:46 crc kubenswrapper[4546]: I0201 06:57:46.992562 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.087942 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-scripts\") pod \"7af56bb5-2257-4f2f-97c8-a33236d55b81\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.088128 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af56bb5-2257-4f2f-97c8-a33236d55b81-logs\") pod \"7af56bb5-2257-4f2f-97c8-a33236d55b81\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.088151 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-combined-ca-bundle\") pod \"7af56bb5-2257-4f2f-97c8-a33236d55b81\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.088178 4546 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-config-data\") pod \"7af56bb5-2257-4f2f-97c8-a33236d55b81\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.088214 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6wvv\" (UniqueName: \"kubernetes.io/projected/7af56bb5-2257-4f2f-97c8-a33236d55b81-kube-api-access-f6wvv\") pod \"7af56bb5-2257-4f2f-97c8-a33236d55b81\" (UID: \"7af56bb5-2257-4f2f-97c8-a33236d55b81\") " Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.088807 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af56bb5-2257-4f2f-97c8-a33236d55b81-logs" (OuterVolumeSpecName: "logs") pod "7af56bb5-2257-4f2f-97c8-a33236d55b81" (UID: "7af56bb5-2257-4f2f-97c8-a33236d55b81"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.097025 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af56bb5-2257-4f2f-97c8-a33236d55b81-kube-api-access-f6wvv" (OuterVolumeSpecName: "kube-api-access-f6wvv") pod "7af56bb5-2257-4f2f-97c8-a33236d55b81" (UID: "7af56bb5-2257-4f2f-97c8-a33236d55b81"). InnerVolumeSpecName "kube-api-access-f6wvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.100948 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-scripts" (OuterVolumeSpecName: "scripts") pod "7af56bb5-2257-4f2f-97c8-a33236d55b81" (UID: "7af56bb5-2257-4f2f-97c8-a33236d55b81"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.117979 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7af56bb5-2257-4f2f-97c8-a33236d55b81" (UID: "7af56bb5-2257-4f2f-97c8-a33236d55b81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.158985 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-config-data" (OuterVolumeSpecName: "config-data") pod "7af56bb5-2257-4f2f-97c8-a33236d55b81" (UID: "7af56bb5-2257-4f2f-97c8-a33236d55b81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.191666 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.192106 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af56bb5-2257-4f2f-97c8-a33236d55b81-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.192166 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.192228 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6wvv\" (UniqueName: \"kubernetes.io/projected/7af56bb5-2257-4f2f-97c8-a33236d55b81-kube-api-access-f6wvv\") on node \"crc\" DevicePath \"\"" Feb 01 
06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.192284 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7af56bb5-2257-4f2f-97c8-a33236d55b81-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.611056 4546 generic.go:334] "Generic (PLEG): container finished" podID="156aa66f-373e-4f1f-bcb5-4a764235a839" containerID="d26c1a3c7b7135a987f7d5a19835eccee9bed2582a192dbe74791bf6131eec26" exitCode=0 Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.611126 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4wt8z" event={"ID":"156aa66f-373e-4f1f-bcb5-4a764235a839","Type":"ContainerDied","Data":"d26c1a3c7b7135a987f7d5a19835eccee9bed2582a192dbe74791bf6131eec26"} Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.613339 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qjczq" event={"ID":"7af56bb5-2257-4f2f-97c8-a33236d55b81","Type":"ContainerDied","Data":"8c6963306ee5a846733476c4d3ca190dbb3c02a097cc0944d89d4f6a1fc5d3d2"} Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.613387 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c6963306ee5a846733476c4d3ca190dbb3c02a097cc0944d89d4f6a1fc5d3d2" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.613524 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qjczq" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.703929 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7587b5bb54-sqc4h"] Feb 01 06:57:47 crc kubenswrapper[4546]: E0201 06:57:47.704324 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af56bb5-2257-4f2f-97c8-a33236d55b81" containerName="placement-db-sync" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.704339 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af56bb5-2257-4f2f-97c8-a33236d55b81" containerName="placement-db-sync" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.704512 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af56bb5-2257-4f2f-97c8-a33236d55b81" containerName="placement-db-sync" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.706387 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.714546 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.715818 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.717524 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.717577 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.717698 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-xcwwh" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.727113 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-7587b5bb54-sqc4h"] Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.810643 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-scripts\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.810727 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-internal-tls-certs\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.810772 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-public-tls-certs\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.810843 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqfh9\" (UniqueName: \"kubernetes.io/projected/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-kube-api-access-bqfh9\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.810957 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-config-data\") pod \"placement-7587b5bb54-sqc4h\" (UID: 
\"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.811077 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-combined-ca-bundle\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.811101 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-logs\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.914283 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-scripts\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.914347 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-internal-tls-certs\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.914374 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-public-tls-certs\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " 
pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.914413 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqfh9\" (UniqueName: \"kubernetes.io/projected/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-kube-api-access-bqfh9\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.914458 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-config-data\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.914497 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-logs\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.914515 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-combined-ca-bundle\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.916273 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-logs\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.921123 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-config-data\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.921836 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-public-tls-certs\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.928278 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-scripts\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.929678 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-internal-tls-certs\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.939380 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-combined-ca-bundle\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:47 crc kubenswrapper[4546]: I0201 06:57:47.939715 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqfh9\" (UniqueName: 
\"kubernetes.io/projected/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-kube-api-access-bqfh9\") pod \"placement-7587b5bb54-sqc4h\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:48 crc kubenswrapper[4546]: I0201 06:57:48.060258 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:57:48 crc kubenswrapper[4546]: I0201 06:57:48.750168 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7587b5bb54-sqc4h"] Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.089134 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.153024 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-config-data\") pod \"156aa66f-373e-4f1f-bcb5-4a764235a839\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.153119 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-fernet-keys\") pod \"156aa66f-373e-4f1f-bcb5-4a764235a839\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.153164 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-combined-ca-bundle\") pod \"156aa66f-373e-4f1f-bcb5-4a764235a839\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.153234 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-998bg\" (UniqueName: 
\"kubernetes.io/projected/156aa66f-373e-4f1f-bcb5-4a764235a839-kube-api-access-998bg\") pod \"156aa66f-373e-4f1f-bcb5-4a764235a839\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.153266 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-credential-keys\") pod \"156aa66f-373e-4f1f-bcb5-4a764235a839\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.153374 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-scripts\") pod \"156aa66f-373e-4f1f-bcb5-4a764235a839\" (UID: \"156aa66f-373e-4f1f-bcb5-4a764235a839\") " Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.164632 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-scripts" (OuterVolumeSpecName: "scripts") pod "156aa66f-373e-4f1f-bcb5-4a764235a839" (UID: "156aa66f-373e-4f1f-bcb5-4a764235a839"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.168999 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156aa66f-373e-4f1f-bcb5-4a764235a839-kube-api-access-998bg" (OuterVolumeSpecName: "kube-api-access-998bg") pod "156aa66f-373e-4f1f-bcb5-4a764235a839" (UID: "156aa66f-373e-4f1f-bcb5-4a764235a839"). InnerVolumeSpecName "kube-api-access-998bg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.170972 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "156aa66f-373e-4f1f-bcb5-4a764235a839" (UID: "156aa66f-373e-4f1f-bcb5-4a764235a839"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.172960 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "156aa66f-373e-4f1f-bcb5-4a764235a839" (UID: "156aa66f-373e-4f1f-bcb5-4a764235a839"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.212087 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-config-data" (OuterVolumeSpecName: "config-data") pod "156aa66f-373e-4f1f-bcb5-4a764235a839" (UID: "156aa66f-373e-4f1f-bcb5-4a764235a839"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.252661 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "156aa66f-373e-4f1f-bcb5-4a764235a839" (UID: "156aa66f-373e-4f1f-bcb5-4a764235a839"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.255661 4546 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.255693 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.255707 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-998bg\" (UniqueName: \"kubernetes.io/projected/156aa66f-373e-4f1f-bcb5-4a764235a839-kube-api-access-998bg\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.255720 4546 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.255727 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.255735 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156aa66f-373e-4f1f-bcb5-4a764235a839-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.638266 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4wt8z" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.638751 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4wt8z" event={"ID":"156aa66f-373e-4f1f-bcb5-4a764235a839","Type":"ContainerDied","Data":"d11ca99e425bd95ce6f1bf53a9c50bd9e556a1f9c1d390787346cabca6e3b3ef"} Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.638794 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d11ca99e425bd95ce6f1bf53a9c50bd9e556a1f9c1d390787346cabca6e3b3ef" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.705225 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7587b5bb54-sqc4h" event={"ID":"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9","Type":"ContainerStarted","Data":"3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5"} Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.705258 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7587b5bb54-sqc4h" event={"ID":"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9","Type":"ContainerStarted","Data":"fb4f3b5dd3310223b1799ba189beca45b1eae3f79b486f7516098e204fd0b1d8"} Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.764359 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-675d5f5fd9-ptdjf"] Feb 01 06:57:49 crc kubenswrapper[4546]: E0201 06:57:49.764758 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156aa66f-373e-4f1f-bcb5-4a764235a839" containerName="keystone-bootstrap" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.764771 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="156aa66f-373e-4f1f-bcb5-4a764235a839" containerName="keystone-bootstrap" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.765009 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="156aa66f-373e-4f1f-bcb5-4a764235a839" containerName="keystone-bootstrap" Feb 01 
06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.765547 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.778034 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-675d5f5fd9-ptdjf"] Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.781597 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.782175 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.804251 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q48f2" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.804595 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.804754 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.804902 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.888108 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-scripts\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.888153 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-combined-ca-bundle\") 
pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.888307 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npqz\" (UniqueName: \"kubernetes.io/projected/3e8d588f-96dc-484b-a9c3-9fa403798d3e-kube-api-access-9npqz\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.888365 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-fernet-keys\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.888395 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-internal-tls-certs\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.888449 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-config-data\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.888699 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-public-tls-certs\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:49 crc kubenswrapper[4546]: I0201 06:57:49.888746 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-credential-keys\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:49.993236 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npqz\" (UniqueName: \"kubernetes.io/projected/3e8d588f-96dc-484b-a9c3-9fa403798d3e-kube-api-access-9npqz\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:49.993608 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-fernet-keys\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:49.993638 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-internal-tls-certs\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:49.993678 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-config-data\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:49.993876 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-public-tls-certs\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:49.993907 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-credential-keys\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:49.993958 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-scripts\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:49.993976 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-combined-ca-bundle\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:49.999957 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-config-data\") pod \"keystone-675d5f5fd9-ptdjf\" 
(UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.000979 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-combined-ca-bundle\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.004549 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-credential-keys\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.004659 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-fernet-keys\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.005594 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-scripts\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.010531 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-internal-tls-certs\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 
06:57:50.011028 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8d588f-96dc-484b-a9c3-9fa403798d3e-public-tls-certs\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf"
Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.030408 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npqz\" (UniqueName: \"kubernetes.io/projected/3e8d588f-96dc-484b-a9c3-9fa403798d3e-kube-api-access-9npqz\") pod \"keystone-675d5f5fd9-ptdjf\" (UID: \"3e8d588f-96dc-484b-a9c3-9fa403798d3e\") " pod="openstack/keystone-675d5f5fd9-ptdjf"
Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.114872 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-675d5f5fd9-ptdjf"
Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.679967 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-675d5f5fd9-ptdjf"]
Feb 01 06:57:50 crc kubenswrapper[4546]: W0201 06:57:50.699643 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e8d588f_96dc_484b_a9c3_9fa403798d3e.slice/crio-47731e467aa92b77d2980ca73ff744f2fb46fdefc885467efbfee6eea97f62bf WatchSource:0}: Error finding container 47731e467aa92b77d2980ca73ff744f2fb46fdefc885467efbfee6eea97f62bf: Status 404 returned error can't find the container with id 47731e467aa92b77d2980ca73ff744f2fb46fdefc885467efbfee6eea97f62bf
Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.706190 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7587b5bb54-sqc4h" event={"ID":"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9","Type":"ContainerStarted","Data":"e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6"}
Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.706938 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7587b5bb54-sqc4h"
Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.706985 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7587b5bb54-sqc4h"
Feb 01 06:57:50 crc kubenswrapper[4546]: I0201 06:57:50.734567 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7587b5bb54-sqc4h" podStartSLOduration=3.734546529 podStartE2EDuration="3.734546529s" podCreationTimestamp="2026-02-01 06:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:50.722279895 +0000 UTC m=+901.373215911" watchObservedRunningTime="2026-02-01 06:57:50.734546529 +0000 UTC m=+901.385482544"
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.097595 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55cb447c8f-m8jw2"
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.571031 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c"
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.635507 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b885cfc67-gxrmd"]
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.635736 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" podUID="25e08b4c-97bb-43a5-b961-e2191859692d" containerName="dnsmasq-dns" containerID="cri-o://4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e" gracePeriod=10
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.736863 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-675d5f5fd9-ptdjf" event={"ID":"3e8d588f-96dc-484b-a9c3-9fa403798d3e","Type":"ContainerStarted","Data":"a47a17963e16e67dba7e175877b5184b6874ea77c3d01e0cd57c4b096ea03029"}
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.737177 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-675d5f5fd9-ptdjf"
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.737196 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-675d5f5fd9-ptdjf" event={"ID":"3e8d588f-96dc-484b-a9c3-9fa403798d3e","Type":"ContainerStarted","Data":"47731e467aa92b77d2980ca73ff744f2fb46fdefc885467efbfee6eea97f62bf"}
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.794328 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-675d5f5fd9-ptdjf" podStartSLOduration=2.79430438 podStartE2EDuration="2.79430438s" podCreationTimestamp="2026-02-01 06:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:57:51.771425612 +0000 UTC m=+902.422361628" watchObservedRunningTime="2026-02-01 06:57:51.79430438 +0000 UTC m=+902.445240396"
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.896546 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-748cdb7884-m5r26" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.898668 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-748cdb7884-m5r26" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 01 06:57:51 crc kubenswrapper[4546]: I0201 06:57:51.898962 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-748cdb7884-m5r26" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.393396 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd"
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.572727 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-swift-storage-0\") pod \"25e08b4c-97bb-43a5-b961-e2191859692d\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") "
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.572809 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-nb\") pod \"25e08b4c-97bb-43a5-b961-e2191859692d\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") "
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.572832 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-svc\") pod \"25e08b4c-97bb-43a5-b961-e2191859692d\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") "
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.572889 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psh52\" (UniqueName: \"kubernetes.io/projected/25e08b4c-97bb-43a5-b961-e2191859692d-kube-api-access-psh52\") pod \"25e08b4c-97bb-43a5-b961-e2191859692d\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") "
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.572917 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-sb\") pod \"25e08b4c-97bb-43a5-b961-e2191859692d\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") "
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.572955 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-config\") pod \"25e08b4c-97bb-43a5-b961-e2191859692d\" (UID: \"25e08b4c-97bb-43a5-b961-e2191859692d\") "
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.604014 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e08b4c-97bb-43a5-b961-e2191859692d-kube-api-access-psh52" (OuterVolumeSpecName: "kube-api-access-psh52") pod "25e08b4c-97bb-43a5-b961-e2191859692d" (UID: "25e08b4c-97bb-43a5-b961-e2191859692d"). InnerVolumeSpecName "kube-api-access-psh52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.678496 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psh52\" (UniqueName: \"kubernetes.io/projected/25e08b4c-97bb-43a5-b961-e2191859692d-kube-api-access-psh52\") on node \"crc\" DevicePath \"\""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.680342 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25e08b4c-97bb-43a5-b961-e2191859692d" (UID: "25e08b4c-97bb-43a5-b961-e2191859692d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.708512 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25e08b4c-97bb-43a5-b961-e2191859692d" (UID: "25e08b4c-97bb-43a5-b961-e2191859692d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.710947 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-config" (OuterVolumeSpecName: "config") pod "25e08b4c-97bb-43a5-b961-e2191859692d" (UID: "25e08b4c-97bb-43a5-b961-e2191859692d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.727235 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25e08b4c-97bb-43a5-b961-e2191859692d" (UID: "25e08b4c-97bb-43a5-b961-e2191859692d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.763642 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25e08b4c-97bb-43a5-b961-e2191859692d" (UID: "25e08b4c-97bb-43a5-b961-e2191859692d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.781187 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.781544 4546 generic.go:334] "Generic (PLEG): container finished" podID="25e08b4c-97bb-43a5-b961-e2191859692d" containerID="4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e" exitCode=0
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.781943 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd"
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.781955 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" event={"ID":"25e08b4c-97bb-43a5-b961-e2191859692d","Type":"ContainerDied","Data":"4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e"}
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.782000 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.782015 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.782028 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.782032 4546 scope.go:117] "RemoveContainer" containerID="4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e"
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.782039 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e08b4c-97bb-43a5-b961-e2191859692d-config\") on node \"crc\" DevicePath \"\""
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.782018 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b885cfc67-gxrmd" event={"ID":"25e08b4c-97bb-43a5-b961-e2191859692d","Type":"ContainerDied","Data":"036315f6c4ebb17d9c4f7cae063ca825256a87af396917662d3571615f2112d3"}
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.810258 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b885cfc67-gxrmd"]
Feb 01 06:57:52 crc kubenswrapper[4546]: I0201 06:57:52.815252 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b885cfc67-gxrmd"]
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.089604 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.089663 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.131732 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.190844 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.560476 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-6bbbc47dc7-979jx" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.567241 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-6bbbc47dc7-979jx" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.605709 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6bbbc47dc7-979jx" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.670571 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e08b4c-97bb-43a5-b961-e2191859692d" path="/var/lib/kubelet/pods/25e08b4c-97bb-43a5-b961-e2191859692d/volumes"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.789547 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.789589 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.884767 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c8bd8cd6b-vfjlr" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused"
Feb 01 06:57:53 crc kubenswrapper[4546]: I0201 06:57:53.994148 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5867f5bb44-shmxj" podUID="856b2577-3e14-4b6a-9480-9c49b57aad40" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused"
Feb 01 06:57:54 crc kubenswrapper[4546]: I0201 06:57:54.028531 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 01 06:57:54 crc kubenswrapper[4546]: I0201 06:57:54.028602 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 01 06:57:54 crc kubenswrapper[4546]: I0201 06:57:54.272651 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 01 06:57:54 crc kubenswrapper[4546]: I0201 06:57:54.279394 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 01 06:57:54 crc kubenswrapper[4546]: I0201 06:57:54.805512 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 01 06:57:54 crc kubenswrapper[4546]: I0201 06:57:54.805872 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 01 06:57:55 crc kubenswrapper[4546]: I0201 06:57:55.422301 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 06:57:55 crc kubenswrapper[4546]: I0201 06:57:55.422381 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 06:57:55 crc kubenswrapper[4546]: I0201 06:57:55.818533 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 01 06:57:55 crc kubenswrapper[4546]: I0201 06:57:55.818575 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 01 06:57:56 crc kubenswrapper[4546]: I0201 06:57:56.220899 4546 scope.go:117] "RemoveContainer" containerID="a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765"
Feb 01 06:57:58 crc kubenswrapper[4546]: I0201 06:57:58.581149 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 01 06:57:58 crc kubenswrapper[4546]: I0201 06:57:58.582373 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 01 06:57:58 crc kubenswrapper[4546]: I0201 06:57:58.729401 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 01 06:57:58 crc kubenswrapper[4546]: I0201 06:57:58.997107 4546 scope.go:117] "RemoveContainer" containerID="4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e"
Feb 01 06:57:58 crc kubenswrapper[4546]: E0201 06:57:58.998695 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e\": container with ID starting with 4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e not found: ID does not exist" containerID="4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e"
Feb 01 06:57:58 crc kubenswrapper[4546]: I0201 06:57:58.998725 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e"} err="failed to get container status \"4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e\": rpc error: code = NotFound desc = could not find container \"4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e\": container with ID starting with 4c578615b86fb94504514e997d505c7337b9416213db47356f9d4c994706232e not found: ID does not exist"
Feb 01 06:57:58 crc kubenswrapper[4546]: I0201 06:57:58.998743 4546 scope.go:117] "RemoveContainer" containerID="a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765"
Feb 01 06:57:59 crc kubenswrapper[4546]: E0201 06:57:59.001925 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765\": container with ID starting with a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765 not found: ID does not exist" containerID="a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765"
Feb 01 06:57:59 crc kubenswrapper[4546]: I0201 06:57:59.001950 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765"} err="failed to get container status \"a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765\": rpc error: code = NotFound desc = could not find container \"a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765\": container with ID starting with a50577b1f76219b1ce52e8c8aa89790a12039d67edb2b5b6a21d5b912760c765 not found: ID does not exist"
Feb 01 06:57:59 crc kubenswrapper[4546]: I0201 06:57:59.740273 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 01 06:57:59 crc kubenswrapper[4546]: I0201 06:57:59.740684 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 01 06:57:59 crc kubenswrapper[4546]: I0201 06:57:59.883239 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerStarted","Data":"fe732eac3b0b024b973f4d60a23efb8d9e2182a1699c6d4a0b204cf9c53035e4"}
Feb 01 06:57:59 crc kubenswrapper[4546]: I0201 06:57:59.884780 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6ktch" event={"ID":"8b4a2956-c177-42f3-8981-830dbac77943","Type":"ContainerStarted","Data":"8a1cfa49fdc5ff1dbc4a657cffc212f55d16123cd836ab421475783c61e3cad9"}
Feb 01 06:57:59 crc kubenswrapper[4546]: I0201 06:57:59.887064 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pgw6x" event={"ID":"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6","Type":"ContainerStarted","Data":"0a4d32d91dc7b8a6390654f4a33444f520d817581ac9dd9e029b885d48bf0af0"}
Feb 01 06:57:59 crc kubenswrapper[4546]: I0201 06:57:59.906704 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-6ktch" podStartSLOduration=2.323708037 podStartE2EDuration="59.906694148s" podCreationTimestamp="2026-02-01 06:57:00 +0000 UTC" firstStartedPulling="2026-02-01 06:57:01.552165589 +0000 UTC m=+852.203101605" lastFinishedPulling="2026-02-01 06:57:59.1351517 +0000 UTC m=+909.786087716" observedRunningTime="2026-02-01 06:57:59.901399718 +0000 UTC m=+910.552335734" watchObservedRunningTime="2026-02-01 06:57:59.906694148 +0000 UTC m=+910.557630164"
Feb 01 06:57:59 crc kubenswrapper[4546]: I0201 06:57:59.924506 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-pgw6x" podStartSLOduration=3.172034692 podStartE2EDuration="59.924487601s" podCreationTimestamp="2026-02-01 06:57:00 +0000 UTC" firstStartedPulling="2026-02-01 06:57:02.382603791 +0000 UTC m=+853.033539807" lastFinishedPulling="2026-02-01 06:57:59.135056711 +0000 UTC m=+909.785992716" observedRunningTime="2026-02-01 06:57:59.917233877 +0000 UTC m=+910.568169893" watchObservedRunningTime="2026-02-01 06:57:59.924487601 +0000 UTC m=+910.575423617"
Feb 01 06:57:59 crc kubenswrapper[4546]: I0201 06:57:59.976806 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 01 06:58:00 crc kubenswrapper[4546]: I0201 06:58:00.898035 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b9btc" event={"ID":"59c89483-60db-4db0-8957-32962d2a73b1","Type":"ContainerStarted","Data":"7387d0540462a56826d95378b0f343e5f40a5b9f2809ffe02c0191c1f245881e"}
Feb 01 06:58:00 crc kubenswrapper[4546]: I0201 06:58:00.920204 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-b9btc" podStartSLOduration=3.919987591 podStartE2EDuration="1m0.920188529s" podCreationTimestamp="2026-02-01 06:57:00 +0000 UTC" firstStartedPulling="2026-02-01 06:57:02.145079656 +0000 UTC m=+852.796015672" lastFinishedPulling="2026-02-01 06:57:59.145280593 +0000 UTC m=+909.796216610" observedRunningTime="2026-02-01 06:58:00.914969331 +0000 UTC m=+911.565905346" watchObservedRunningTime="2026-02-01 06:58:00.920188529 +0000 UTC m=+911.571124545"
Feb 01 06:58:03 crc kubenswrapper[4546]: I0201 06:58:03.879182 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c8bd8cd6b-vfjlr" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused"
Feb 01 06:58:03 crc kubenswrapper[4546]: I0201 06:58:03.924071 4546 generic.go:334] "Generic (PLEG): container finished" podID="91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" containerID="0a4d32d91dc7b8a6390654f4a33444f520d817581ac9dd9e029b885d48bf0af0" exitCode=0
Feb 01 06:58:03 crc kubenswrapper[4546]: I0201 06:58:03.924123 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pgw6x" event={"ID":"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6","Type":"ContainerDied","Data":"0a4d32d91dc7b8a6390654f4a33444f520d817581ac9dd9e029b885d48bf0af0"}
Feb 01 06:58:03 crc kubenswrapper[4546]: I0201 06:58:03.989299 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5867f5bb44-shmxj" podUID="856b2577-3e14-4b6a-9480-9c49b57aad40" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused"
Feb 01 06:58:04 crc kubenswrapper[4546]: I0201 06:58:04.934972 4546 generic.go:334] "Generic (PLEG): container finished" podID="8b4a2956-c177-42f3-8981-830dbac77943" containerID="8a1cfa49fdc5ff1dbc4a657cffc212f55d16123cd836ab421475783c61e3cad9" exitCode=0
Feb 01 06:58:04 crc kubenswrapper[4546]: I0201 06:58:04.935038 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6ktch" event={"ID":"8b4a2956-c177-42f3-8981-830dbac77943","Type":"ContainerDied","Data":"8a1cfa49fdc5ff1dbc4a657cffc212f55d16123cd836ab421475783c61e3cad9"}
Feb 01 06:58:05 crc kubenswrapper[4546]: I0201 06:58:05.946264 4546 generic.go:334] "Generic (PLEG): container finished" podID="59c89483-60db-4db0-8957-32962d2a73b1" containerID="7387d0540462a56826d95378b0f343e5f40a5b9f2809ffe02c0191c1f245881e" exitCode=0
Feb 01 06:58:05 crc kubenswrapper[4546]: I0201 06:58:05.947090 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b9btc" event={"ID":"59c89483-60db-4db0-8957-32962d2a73b1","Type":"ContainerDied","Data":"7387d0540462a56826d95378b0f343e5f40a5b9f2809ffe02c0191c1f245881e"}
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.740548 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b9btc"
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.784086 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pgw6x"
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.799250 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-6ktch"
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.814779 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59c89483-60db-4db0-8957-32962d2a73b1-etc-machine-id\") pod \"59c89483-60db-4db0-8957-32962d2a73b1\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.814882 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-scripts\") pod \"59c89483-60db-4db0-8957-32962d2a73b1\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.814918 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59c89483-60db-4db0-8957-32962d2a73b1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "59c89483-60db-4db0-8957-32962d2a73b1" (UID: "59c89483-60db-4db0-8957-32962d2a73b1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.814943 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-db-sync-config-data\") pod \"59c89483-60db-4db0-8957-32962d2a73b1\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815064 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f28gf\" (UniqueName: \"kubernetes.io/projected/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-kube-api-access-f28gf\") pod \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815106 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvvqs\" (UniqueName: \"kubernetes.io/projected/59c89483-60db-4db0-8957-32962d2a73b1-kube-api-access-wvvqs\") pod \"59c89483-60db-4db0-8957-32962d2a73b1\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815141 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-combined-ca-bundle\") pod \"8b4a2956-c177-42f3-8981-830dbac77943\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815207 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-combined-ca-bundle\") pod \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815246 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp4g7\" (UniqueName: \"kubernetes.io/projected/8b4a2956-c177-42f3-8981-830dbac77943-kube-api-access-cp4g7\") pod \"8b4a2956-c177-42f3-8981-830dbac77943\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815287 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-config-data\") pod \"59c89483-60db-4db0-8957-32962d2a73b1\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815323 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-combined-ca-bundle\") pod \"59c89483-60db-4db0-8957-32962d2a73b1\" (UID: \"59c89483-60db-4db0-8957-32962d2a73b1\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815367 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-config-data\") pod \"8b4a2956-c177-42f3-8981-830dbac77943\" (UID: \"8b4a2956-c177-42f3-8981-830dbac77943\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815397 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-db-sync-config-data\") pod \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\" (UID: \"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6\") "
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.815929 4546 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/59c89483-60db-4db0-8957-32962d2a73b1-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.858166 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" (UID: "91d86af3-9b64-4ebd-ac39-e2063ea7c9b6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.858434 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-kube-api-access-f28gf" (OuterVolumeSpecName: "kube-api-access-f28gf") pod "91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" (UID: "91d86af3-9b64-4ebd-ac39-e2063ea7c9b6"). InnerVolumeSpecName "kube-api-access-f28gf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.862036 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-scripts" (OuterVolumeSpecName: "scripts") pod "59c89483-60db-4db0-8957-32962d2a73b1" (UID: "59c89483-60db-4db0-8957-32962d2a73b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.862994 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4a2956-c177-42f3-8981-830dbac77943-kube-api-access-cp4g7" (OuterVolumeSpecName: "kube-api-access-cp4g7") pod "8b4a2956-c177-42f3-8981-830dbac77943" (UID: "8b4a2956-c177-42f3-8981-830dbac77943"). InnerVolumeSpecName "kube-api-access-cp4g7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.866681 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c89483-60db-4db0-8957-32962d2a73b1-kube-api-access-wvvqs" (OuterVolumeSpecName: "kube-api-access-wvvqs") pod "59c89483-60db-4db0-8957-32962d2a73b1" (UID: "59c89483-60db-4db0-8957-32962d2a73b1"). InnerVolumeSpecName "kube-api-access-wvvqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.867347 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "59c89483-60db-4db0-8957-32962d2a73b1" (UID: "59c89483-60db-4db0-8957-32962d2a73b1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.887656 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59c89483-60db-4db0-8957-32962d2a73b1" (UID: "59c89483-60db-4db0-8957-32962d2a73b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.890087 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b4a2956-c177-42f3-8981-830dbac77943" (UID: "8b4a2956-c177-42f3-8981-830dbac77943"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.907715 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" (UID: "91d86af3-9b64-4ebd-ac39-e2063ea7c9b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.917383 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.917607 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp4g7\" (UniqueName: \"kubernetes.io/projected/8b4a2956-c177-42f3-8981-830dbac77943-kube-api-access-cp4g7\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.917838 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.917915 4546 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.917973 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.918025 4546 reconciler_common.go:293] "Volume detached for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.918071 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f28gf\" (UniqueName: \"kubernetes.io/projected/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6-kube-api-access-f28gf\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.918121 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvvqs\" (UniqueName: \"kubernetes.io/projected/59c89483-60db-4db0-8957-32962d2a73b1-kube-api-access-wvvqs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.918172 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.935411 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-config-data" (OuterVolumeSpecName: "config-data") pod "59c89483-60db-4db0-8957-32962d2a73b1" (UID: "59c89483-60db-4db0-8957-32962d2a73b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.957063 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-config-data" (OuterVolumeSpecName: "config-data") pod "8b4a2956-c177-42f3-8981-830dbac77943" (UID: "8b4a2956-c177-42f3-8981-830dbac77943"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.989918 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerStarted","Data":"cc02b6b9ba589cae973a7abffbbd6564dd4c6e4bdba7743789ceaad408f98e15"} Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.990547 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="ceilometer-central-agent" containerID="cri-o://07590c57da60555fe686858a2df6c9fc569ea928439e69fe7aecfb572f0003eb" gracePeriod=30 Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.990781 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.991288 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="proxy-httpd" containerID="cri-o://cc02b6b9ba589cae973a7abffbbd6564dd4c6e4bdba7743789ceaad408f98e15" gracePeriod=30 Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.991314 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="ceilometer-notification-agent" containerID="cri-o://4a5f15bd1d7835c016f46c2f196f9d2d2ae66c2104844c833cfa3d78a502e4a4" gracePeriod=30 Feb 01 06:58:10 crc kubenswrapper[4546]: I0201 06:58:10.991483 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="sg-core" containerID="cri-o://fe732eac3b0b024b973f4d60a23efb8d9e2182a1699c6d4a0b204cf9c53035e4" gracePeriod=30 Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.002442 4546 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-6ktch" event={"ID":"8b4a2956-c177-42f3-8981-830dbac77943","Type":"ContainerDied","Data":"0a8ef7f47c50e5545c89b58cb195bc83798374fc862209edf14722334145d7cf"} Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.002484 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a8ef7f47c50e5545c89b58cb195bc83798374fc862209edf14722334145d7cf" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.003226 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-6ktch" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.006775 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b9btc" event={"ID":"59c89483-60db-4db0-8957-32962d2a73b1","Type":"ContainerDied","Data":"cb6ea4acf4ed0b6c1543d8a08b901f0c6091063173bc75fca93c706e924b2753"} Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.006821 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb6ea4acf4ed0b6c1543d8a08b901f0c6091063173bc75fca93c706e924b2753" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.006935 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b9btc" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.014686 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pgw6x" event={"ID":"91d86af3-9b64-4ebd-ac39-e2063ea7c9b6","Type":"ContainerDied","Data":"2f875253b9bbda7747a9df25c7280e2629681432a3a4d058e645c2e832b7563c"} Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.014786 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pgw6x" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.014788 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f875253b9bbda7747a9df25c7280e2629681432a3a4d058e645c2e832b7563c" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.019993 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.568140566 podStartE2EDuration="1m11.019971358s" podCreationTimestamp="2026-02-01 06:57:00 +0000 UTC" firstStartedPulling="2026-02-01 06:57:02.21011329 +0000 UTC m=+852.861049306" lastFinishedPulling="2026-02-01 06:58:10.661944082 +0000 UTC m=+921.312880098" observedRunningTime="2026-02-01 06:58:11.014522285 +0000 UTC m=+921.665458291" watchObservedRunningTime="2026-02-01 06:58:11.019971358 +0000 UTC m=+921.670907373" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.021309 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59c89483-60db-4db0-8957-32962d2a73b1-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.021341 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b4a2956-c177-42f3-8981-830dbac77943-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.850989 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.941078 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwgcn\" (UniqueName: \"kubernetes.io/projected/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-kube-api-access-gwgcn\") pod \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.941201 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-horizon-secret-key\") pod \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.941272 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-config-data\") pod \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.941400 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-logs\") pod \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.941534 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-scripts\") pod \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\" (UID: \"3499bb03-a1f8-4eef-b0da-3e1b3deb224d\") " Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.942745 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-logs" (OuterVolumeSpecName: "logs") pod "3499bb03-a1f8-4eef-b0da-3e1b3deb224d" (UID: "3499bb03-a1f8-4eef-b0da-3e1b3deb224d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.949624 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3499bb03-a1f8-4eef-b0da-3e1b3deb224d" (UID: "3499bb03-a1f8-4eef-b0da-3e1b3deb224d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.966107 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-kube-api-access-gwgcn" (OuterVolumeSpecName: "kube-api-access-gwgcn") pod "3499bb03-a1f8-4eef-b0da-3e1b3deb224d" (UID: "3499bb03-a1f8-4eef-b0da-3e1b3deb224d"). InnerVolumeSpecName "kube-api-access-gwgcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.989692 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-scripts" (OuterVolumeSpecName: "scripts") pod "3499bb03-a1f8-4eef-b0da-3e1b3deb224d" (UID: "3499bb03-a1f8-4eef-b0da-3e1b3deb224d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.993755 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 06:58:11 crc kubenswrapper[4546]: E0201 06:58:11.994230 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" containerName="horizon-log" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994254 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" containerName="horizon-log" Feb 01 06:58:11 crc kubenswrapper[4546]: E0201 06:58:11.994264 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" containerName="barbican-db-sync" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994271 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" containerName="barbican-db-sync" Feb 01 06:58:11 crc kubenswrapper[4546]: E0201 06:58:11.994280 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b4a2956-c177-42f3-8981-830dbac77943" containerName="heat-db-sync" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994288 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b4a2956-c177-42f3-8981-830dbac77943" containerName="heat-db-sync" Feb 01 06:58:11 crc kubenswrapper[4546]: E0201 06:58:11.994303 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59c89483-60db-4db0-8957-32962d2a73b1" containerName="cinder-db-sync" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994309 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c89483-60db-4db0-8957-32962d2a73b1" containerName="cinder-db-sync" Feb 01 06:58:11 crc kubenswrapper[4546]: E0201 06:58:11.994341 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" containerName="horizon" Feb 01 06:58:11 crc 
kubenswrapper[4546]: I0201 06:58:11.994349 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" containerName="horizon" Feb 01 06:58:11 crc kubenswrapper[4546]: E0201 06:58:11.994363 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e08b4c-97bb-43a5-b961-e2191859692d" containerName="init" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994369 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e08b4c-97bb-43a5-b961-e2191859692d" containerName="init" Feb 01 06:58:11 crc kubenswrapper[4546]: E0201 06:58:11.994378 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e08b4c-97bb-43a5-b961-e2191859692d" containerName="dnsmasq-dns" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994385 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e08b4c-97bb-43a5-b961-e2191859692d" containerName="dnsmasq-dns" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994562 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b4a2956-c177-42f3-8981-830dbac77943" containerName="heat-db-sync" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994573 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e08b4c-97bb-43a5-b961-e2191859692d" containerName="dnsmasq-dns" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994593 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" containerName="horizon-log" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994609 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" containerName="horizon" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994621 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="59c89483-60db-4db0-8957-32962d2a73b1" containerName="cinder-db-sync" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.994629 4546 
memory_manager.go:354] "RemoveStaleState removing state" podUID="91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" containerName="barbican-db-sync" Feb 01 06:58:11 crc kubenswrapper[4546]: I0201 06:58:11.995561 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.003069 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-config-data" (OuterVolumeSpecName: "config-data") pod "3499bb03-a1f8-4eef-b0da-3e1b3deb224d" (UID: "3499bb03-a1f8-4eef-b0da-3e1b3deb224d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.003404 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.006494 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.006626 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.021719 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.033578 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jn4mm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.051233 4546 generic.go:334] "Generic (PLEG): container finished" podID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" containerID="dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1" exitCode=137 Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.051266 4546 generic.go:334] "Generic (PLEG): container finished" podID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" 
containerID="d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4" exitCode=137 Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.051317 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bf8cbd6d5-wjq5d" event={"ID":"3499bb03-a1f8-4eef-b0da-3e1b3deb224d","Type":"ContainerDied","Data":"dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1"} Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.051334 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bf8cbd6d5-wjq5d" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.051350 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bf8cbd6d5-wjq5d" event={"ID":"3499bb03-a1f8-4eef-b0da-3e1b3deb224d","Type":"ContainerDied","Data":"d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4"} Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.051362 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bf8cbd6d5-wjq5d" event={"ID":"3499bb03-a1f8-4eef-b0da-3e1b3deb224d","Type":"ContainerDied","Data":"597d0910156a02e5eb678220d345fa56948bd49c473b685d21f88fbc0f779d79"} Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.051380 4546 scope.go:117] "RemoveContainer" containerID="dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053309 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053365 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053571 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfc8\" (UniqueName: \"kubernetes.io/projected/3ae79a33-7ef9-4952-a754-e4a2ece9a771-kube-api-access-qhfc8\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053630 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053668 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053687 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ae79a33-7ef9-4952-a754-e4a2ece9a771-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053828 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-scripts\") on node \"crc\" DevicePath \"\"" Feb 
01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053845 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwgcn\" (UniqueName: \"kubernetes.io/projected/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-kube-api-access-gwgcn\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053908 4546 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053918 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.053927 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3499bb03-a1f8-4eef-b0da-3e1b3deb224d-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.081149 4546 generic.go:334] "Generic (PLEG): container finished" podID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerID="fe732eac3b0b024b973f4d60a23efb8d9e2182a1699c6d4a0b204cf9c53035e4" exitCode=2 Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.081182 4546 generic.go:334] "Generic (PLEG): container finished" podID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerID="07590c57da60555fe686858a2df6c9fc569ea928439e69fe7aecfb572f0003eb" exitCode=0 Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.081206 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerDied","Data":"fe732eac3b0b024b973f4d60a23efb8d9e2182a1699c6d4a0b204cf9c53035e4"} Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.081233 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerDied","Data":"07590c57da60555fe686858a2df6c9fc569ea928439e69fe7aecfb572f0003eb"} Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.139211 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bf8cbd6d5-wjq5d"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.154636 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bf8cbd6d5-wjq5d"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.157786 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfc8\" (UniqueName: \"kubernetes.io/projected/3ae79a33-7ef9-4952-a754-e4a2ece9a771-kube-api-access-qhfc8\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.161261 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.161369 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.161468 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ae79a33-7ef9-4952-a754-e4a2ece9a771-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc 
kubenswrapper[4546]: I0201 06:58:12.161777 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.161903 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.166517 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.166824 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ae79a33-7ef9-4952-a754-e4a2ece9a771-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.175156 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.178063 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.182014 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.204175 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6f5ff96685-clxfz"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.205896 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.210540 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhfc8\" (UniqueName: \"kubernetes.io/projected/3ae79a33-7ef9-4952-a754-e4a2ece9a771-kube-api-access-qhfc8\") pod \"cinder-scheduler-0\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.211091 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.211361 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.211504 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-k2llf" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.211609 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b"] Feb 01 06:58:12 crc 
kubenswrapper[4546]: I0201 06:58:12.213115 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.236451 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.246920 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f5ff96685-clxfz"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265189 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99aa72c9-fa66-42d9-bc96-f900f97bc81c-config-data\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265254 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99aa72c9-fa66-42d9-bc96-f900f97bc81c-logs\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265298 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-logs\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265319 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chq47\" (UniqueName: 
\"kubernetes.io/projected/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-kube-api-access-chq47\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265338 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-combined-ca-bundle\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265404 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99aa72c9-fa66-42d9-bc96-f900f97bc81c-config-data-custom\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265423 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbjb9\" (UniqueName: \"kubernetes.io/projected/99aa72c9-fa66-42d9-bc96-f900f97bc81c-kube-api-access-fbjb9\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265449 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aa72c9-fa66-42d9-bc96-f900f97bc81c-combined-ca-bundle\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 
06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265480 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-config-data-custom\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.265533 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-config-data\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.278010 4546 scope.go:117] "RemoveContainer" containerID="d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.287811 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.321908 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d54fc98bf-7sqvm"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.323375 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.350570 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d54fc98bf-7sqvm"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.361516 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375088 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99aa72c9-fa66-42d9-bc96-f900f97bc81c-config-data-custom\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375126 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbjb9\" (UniqueName: \"kubernetes.io/projected/99aa72c9-fa66-42d9-bc96-f900f97bc81c-kube-api-access-fbjb9\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375169 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aa72c9-fa66-42d9-bc96-f900f97bc81c-combined-ca-bundle\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375213 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-config-data-custom\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375258 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-config-data\") pod 
\"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375287 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-nb\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375345 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-svc\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375372 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99aa72c9-fa66-42d9-bc96-f900f97bc81c-config-data\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375394 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-config\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375416 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-swift-storage-0\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375451 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99aa72c9-fa66-42d9-bc96-f900f97bc81c-logs\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375490 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-logs\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375512 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chq47\" (UniqueName: \"kubernetes.io/projected/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-kube-api-access-chq47\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375530 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-sb\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375549 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-combined-ca-bundle\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.375622 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwjh\" (UniqueName: \"kubernetes.io/projected/071babef-eb54-472c-8668-4f856d2d3438-kube-api-access-htwjh\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.376761 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-logs\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.377048 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99aa72c9-fa66-42d9-bc96-f900f97bc81c-logs\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.385215 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-combined-ca-bundle\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.388393 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/99aa72c9-fa66-42d9-bc96-f900f97bc81c-config-data-custom\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.388489 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-config-data-custom\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.389166 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99aa72c9-fa66-42d9-bc96-f900f97bc81c-combined-ca-bundle\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.392048 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99aa72c9-fa66-42d9-bc96-f900f97bc81c-config-data\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.393273 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-config-data\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.398105 4546 scope.go:117] "RemoveContainer" 
containerID="dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1" Feb 01 06:58:12 crc kubenswrapper[4546]: E0201 06:58:12.404237 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1\": container with ID starting with dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1 not found: ID does not exist" containerID="dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.404273 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1"} err="failed to get container status \"dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1\": rpc error: code = NotFound desc = could not find container \"dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1\": container with ID starting with dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1 not found: ID does not exist" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.404292 4546 scope.go:117] "RemoveContainer" containerID="d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4" Feb 01 06:58:12 crc kubenswrapper[4546]: E0201 06:58:12.408714 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4\": container with ID starting with d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4 not found: ID does not exist" containerID="d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.408739 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4"} err="failed to get container status \"d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4\": rpc error: code = NotFound desc = could not find container \"d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4\": container with ID starting with d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4 not found: ID does not exist" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.408753 4546 scope.go:117] "RemoveContainer" containerID="dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.416850 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1"} err="failed to get container status \"dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1\": rpc error: code = NotFound desc = could not find container \"dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1\": container with ID starting with dd35323a171b9438cf723a4c0404054968338bf1dc3b4a45cd3b5df0f9063ea1 not found: ID does not exist" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.416902 4546 scope.go:117] "RemoveContainer" containerID="d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.417320 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4"} err="failed to get container status \"d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4\": rpc error: code = NotFound desc = could not find container \"d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4\": container with ID starting with d4dea8f419d87b1870a8ebcb52da0d35a4b466f3fd24d26c24065c4f37d6bcf4 not found: ID does not 
exist" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.435802 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chq47\" (UniqueName: \"kubernetes.io/projected/2ade6ab0-0cf6-4ebf-aa62-ed547a877d48-kube-api-access-chq47\") pod \"barbican-worker-6f5ff96685-clxfz\" (UID: \"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48\") " pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.445348 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbjb9\" (UniqueName: \"kubernetes.io/projected/99aa72c9-fa66-42d9-bc96-f900f97bc81c-kube-api-access-fbjb9\") pod \"barbican-keystone-listener-6cbd87f6bb-p2x5b\" (UID: \"99aa72c9-fa66-42d9-bc96-f900f97bc81c\") " pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.479191 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-sb\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.479337 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htwjh\" (UniqueName: \"kubernetes.io/projected/071babef-eb54-472c-8668-4f856d2d3438-kube-api-access-htwjh\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.479443 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-nb\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " 
pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.479546 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-svc\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.479618 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-config\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.479683 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-swift-storage-0\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.480951 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-swift-storage-0\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.481392 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-nb\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc 
kubenswrapper[4546]: I0201 06:58:12.481685 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-svc\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.482043 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-sb\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.482366 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-config\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.546366 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d54fc98bf-7sqvm"] Feb 01 06:58:12 crc kubenswrapper[4546]: E0201 06:58:12.547375 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-htwjh], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" podUID="071babef-eb54-472c-8668-4f856d2d3438" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.590620 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6f5ff96685-clxfz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.603091 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwjh\" (UniqueName: \"kubernetes.io/projected/071babef-eb54-472c-8668-4f856d2d3438-kube-api-access-htwjh\") pod \"dnsmasq-dns-5d54fc98bf-7sqvm\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.609387 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.658652 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f9ff4476f-89c94"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.671666 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.783646 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f9ff4476f-89c94"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.787369 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-nb\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.787435 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-swift-storage-0\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 
01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.787504 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-sb\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.787530 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-svc\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.787808 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-config\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.787922 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjmdd\" (UniqueName: \"kubernetes.io/projected/95418b3b-b693-4b25-8ce8-967d233a1e54-kube-api-access-xjmdd\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.832967 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.839641 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.846346 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.873267 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.892047 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-fbbc88988-qj7hz"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.893609 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896191 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-svc\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896336 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data-custom\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896365 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-config\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896385 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896449 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896489 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjmdd\" (UniqueName: \"kubernetes.io/projected/95418b3b-b693-4b25-8ce8-967d233a1e54-kube-api-access-xjmdd\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896517 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-nb\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896618 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-swift-storage-0\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896674 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/514e1b1d-bc66-482c-8b2f-6786671ce639-etc-machine-id\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896732 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-scripts\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896810 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhjpn\" (UniqueName: \"kubernetes.io/projected/514e1b1d-bc66-482c-8b2f-6786671ce639-kube-api-access-xhjpn\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896949 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514e1b1d-bc66-482c-8b2f-6786671ce639-logs\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.896977 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-sb\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.899028 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-svc\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " 
pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.899609 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-config\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.899775 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.900393 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-nb\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.900448 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-sb\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.900842 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-swift-storage-0\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.947448 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-fbbc88988-qj7hz"] Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.977576 4546 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xjmdd\" (UniqueName: \"kubernetes.io/projected/95418b3b-b693-4b25-8ce8-967d233a1e54-kube-api-access-xjmdd\") pod \"dnsmasq-dns-f9ff4476f-89c94\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.999633 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data-custom\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.999686 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.999717 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.999739 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-logs\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:12 crc kubenswrapper[4546]: I0201 06:58:12.999759 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.000498 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data-custom\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.000579 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-combined-ca-bundle\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.000616 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/514e1b1d-bc66-482c-8b2f-6786671ce639-etc-machine-id\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.000727 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-scripts\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.000781 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhjpn\" (UniqueName: \"kubernetes.io/projected/514e1b1d-bc66-482c-8b2f-6786671ce639-kube-api-access-xhjpn\") pod \"cinder-api-0\" 
(UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.000798 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7lmj\" (UniqueName: \"kubernetes.io/projected/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-kube-api-access-r7lmj\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.000827 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514e1b1d-bc66-482c-8b2f-6786671ce639-logs\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.001297 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/514e1b1d-bc66-482c-8b2f-6786671ce639-etc-machine-id\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.003376 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514e1b1d-bc66-482c-8b2f-6786671ce639-logs\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.005842 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data-custom\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.009457 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-scripts\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.017508 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.023784 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.043887 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhjpn\" (UniqueName: \"kubernetes.io/projected/514e1b1d-bc66-482c-8b2f-6786671ce639-kube-api-access-xhjpn\") pod \"cinder-api-0\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.103165 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.103419 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-logs\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " 
pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.103444 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data-custom\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.103533 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-combined-ca-bundle\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.103658 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7lmj\" (UniqueName: \"kubernetes.io/projected/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-kube-api-access-r7lmj\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.109394 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-logs\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.110414 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.115657 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-combined-ca-bundle\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.124259 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data-custom\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.125398 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.127995 4546 generic.go:334] "Generic (PLEG): container finished" podID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerID="b214ed0d0bf3c06ce016b363c648a64db0bbe04f0f3440156b319469962cf096" exitCode=137 Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.128024 4546 generic.go:334] "Generic (PLEG): container finished" podID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerID="ffcaf1e1ba2cc34592de70c4e17dd95cec56248aaa3690ae830c9e46caaf5127" exitCode=137 Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.128077 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55cb447c8f-m8jw2" 
event={"ID":"3ad13b31-fc9b-4e58-97f5-35f208029aad","Type":"ContainerDied","Data":"b214ed0d0bf3c06ce016b363c648a64db0bbe04f0f3440156b319469962cf096"} Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.128107 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55cb447c8f-m8jw2" event={"ID":"3ad13b31-fc9b-4e58-97f5-35f208029aad","Type":"ContainerDied","Data":"ffcaf1e1ba2cc34592de70c4e17dd95cec56248aaa3690ae830c9e46caaf5127"} Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.129967 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.143951 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7lmj\" (UniqueName: \"kubernetes.io/projected/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-kube-api-access-r7lmj\") pod \"barbican-api-fbbc88988-qj7hz\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.156758 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.159827 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.205044 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-config\") pod \"071babef-eb54-472c-8668-4f856d2d3438\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.206274 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-sb\") pod \"071babef-eb54-472c-8668-4f856d2d3438\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.206372 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-svc\") pod \"071babef-eb54-472c-8668-4f856d2d3438\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.206402 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htwjh\" (UniqueName: \"kubernetes.io/projected/071babef-eb54-472c-8668-4f856d2d3438-kube-api-access-htwjh\") pod \"071babef-eb54-472c-8668-4f856d2d3438\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.206453 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-nb\") pod \"071babef-eb54-472c-8668-4f856d2d3438\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.206479 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-swift-storage-0\") pod \"071babef-eb54-472c-8668-4f856d2d3438\" (UID: \"071babef-eb54-472c-8668-4f856d2d3438\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.206135 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-config" (OuterVolumeSpecName: "config") pod "071babef-eb54-472c-8668-4f856d2d3438" (UID: "071babef-eb54-472c-8668-4f856d2d3438"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.207209 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "071babef-eb54-472c-8668-4f856d2d3438" (UID: "071babef-eb54-472c-8668-4f856d2d3438"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.209250 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "071babef-eb54-472c-8668-4f856d2d3438" (UID: "071babef-eb54-472c-8668-4f856d2d3438"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.209581 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "071babef-eb54-472c-8668-4f856d2d3438" (UID: "071babef-eb54-472c-8668-4f856d2d3438"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.213034 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "071babef-eb54-472c-8668-4f856d2d3438" (UID: "071babef-eb54-472c-8668-4f856d2d3438"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.221797 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071babef-eb54-472c-8668-4f856d2d3438-kube-api-access-htwjh" (OuterVolumeSpecName: "kube-api-access-htwjh") pod "071babef-eb54-472c-8668-4f856d2d3438" (UID: "071babef-eb54-472c-8668-4f856d2d3438"). InnerVolumeSpecName "kube-api-access-htwjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.232408 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.309292 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.309326 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.309338 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.309364 4546 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-htwjh\" (UniqueName: \"kubernetes.io/projected/071babef-eb54-472c-8668-4f856d2d3438-kube-api-access-htwjh\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.309373 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.309383 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/071babef-eb54-472c-8668-4f856d2d3438-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.319409 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.515871 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b"] Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.586409 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.672553 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3499bb03-a1f8-4eef-b0da-3e1b3deb224d" path="/var/lib/kubelet/pods/3499bb03-a1f8-4eef-b0da-3e1b3deb224d/volumes" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.699460 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f5ff96685-clxfz"] Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.717917 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ad13b31-fc9b-4e58-97f5-35f208029aad-horizon-secret-key\") pod \"3ad13b31-fc9b-4e58-97f5-35f208029aad\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.717963 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad13b31-fc9b-4e58-97f5-35f208029aad-logs\") pod \"3ad13b31-fc9b-4e58-97f5-35f208029aad\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.718133 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvsmv\" (UniqueName: \"kubernetes.io/projected/3ad13b31-fc9b-4e58-97f5-35f208029aad-kube-api-access-wvsmv\") pod \"3ad13b31-fc9b-4e58-97f5-35f208029aad\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.718188 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-scripts\") pod \"3ad13b31-fc9b-4e58-97f5-35f208029aad\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.718284 4546 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-config-data\") pod \"3ad13b31-fc9b-4e58-97f5-35f208029aad\" (UID: \"3ad13b31-fc9b-4e58-97f5-35f208029aad\") " Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.720537 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad13b31-fc9b-4e58-97f5-35f208029aad-logs" (OuterVolumeSpecName: "logs") pod "3ad13b31-fc9b-4e58-97f5-35f208029aad" (UID: "3ad13b31-fc9b-4e58-97f5-35f208029aad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.752962 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad13b31-fc9b-4e58-97f5-35f208029aad-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3ad13b31-fc9b-4e58-97f5-35f208029aad" (UID: "3ad13b31-fc9b-4e58-97f5-35f208029aad"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.757083 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad13b31-fc9b-4e58-97f5-35f208029aad-kube-api-access-wvsmv" (OuterVolumeSpecName: "kube-api-access-wvsmv") pod "3ad13b31-fc9b-4e58-97f5-35f208029aad" (UID: "3ad13b31-fc9b-4e58-97f5-35f208029aad"). InnerVolumeSpecName "kube-api-access-wvsmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.778411 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-scripts" (OuterVolumeSpecName: "scripts") pod "3ad13b31-fc9b-4e58-97f5-35f208029aad" (UID: "3ad13b31-fc9b-4e58-97f5-35f208029aad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.778884 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-config-data" (OuterVolumeSpecName: "config-data") pod "3ad13b31-fc9b-4e58-97f5-35f208029aad" (UID: "3ad13b31-fc9b-4e58-97f5-35f208029aad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.821630 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvsmv\" (UniqueName: \"kubernetes.io/projected/3ad13b31-fc9b-4e58-97f5-35f208029aad-kube-api-access-wvsmv\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.821662 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.821679 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ad13b31-fc9b-4e58-97f5-35f208029aad-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.821688 4546 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ad13b31-fc9b-4e58-97f5-35f208029aad-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.821696 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad13b31-fc9b-4e58-97f5-35f208029aad-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:13 crc kubenswrapper[4546]: W0201 06:58:13.840291 4546 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95418b3b_b693_4b25_8ce8_967d233a1e54.slice/crio-886881be3d1b8be6b17f7ee9aa05deef182eec19eb06d797cb876a80f2207cdd WatchSource:0}: Error finding container 886881be3d1b8be6b17f7ee9aa05deef182eec19eb06d797cb876a80f2207cdd: Status 404 returned error can't find the container with id 886881be3d1b8be6b17f7ee9aa05deef182eec19eb06d797cb876a80f2207cdd Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.856413 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f9ff4476f-89c94"] Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.865757 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.879142 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c8bd8cd6b-vfjlr" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.880110 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.881185 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"7d86ac28320dfdeffcd7f6de1c9aec106f75400f1752f6450b264050c4e7d9ce"} pod="openstack/horizon-7c8bd8cd6b-vfjlr" containerMessage="Container horizon failed startup probe, will be restarted" Feb 01 06:58:13 crc kubenswrapper[4546]: I0201 06:58:13.881297 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c8bd8cd6b-vfjlr" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon" 
containerID="cri-o://7d86ac28320dfdeffcd7f6de1c9aec106f75400f1752f6450b264050c4e7d9ce" gracePeriod=30 Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.039724 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-fbbc88988-qj7hz"] Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.140396 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"514e1b1d-bc66-482c-8b2f-6786671ce639","Type":"ContainerStarted","Data":"8fc67767fc9a402b07e0c33782afcd23bf6bf077efd2dc0a1a0e69091cde2ab6"} Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.144914 4546 generic.go:334] "Generic (PLEG): container finished" podID="95418b3b-b693-4b25-8ce8-967d233a1e54" containerID="fce29137df87e1ed3a51aaedb21353046645dcb491faa2db5f415d99a8f6b3d8" exitCode=0 Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.144964 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" event={"ID":"95418b3b-b693-4b25-8ce8-967d233a1e54","Type":"ContainerDied","Data":"fce29137df87e1ed3a51aaedb21353046645dcb491faa2db5f415d99a8f6b3d8"} Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.144981 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" event={"ID":"95418b3b-b693-4b25-8ce8-967d233a1e54","Type":"ContainerStarted","Data":"886881be3d1b8be6b17f7ee9aa05deef182eec19eb06d797cb876a80f2207cdd"} Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.161021 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55cb447c8f-m8jw2" event={"ID":"3ad13b31-fc9b-4e58-97f5-35f208029aad","Type":"ContainerDied","Data":"3f6c8429cc258b36dfd36a3f0121da74e51f90ccb6c7fd613a62003bcf30e6a0"} Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.161055 4546 scope.go:117] "RemoveContainer" containerID="b214ed0d0bf3c06ce016b363c648a64db0bbe04f0f3440156b319469962cf096" Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.161146 
4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55cb447c8f-m8jw2" Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.172104 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ae79a33-7ef9-4952-a754-e4a2ece9a771","Type":"ContainerStarted","Data":"d6192ee938ca066eeee6cd52764623f66b501e7ca1089c2964f8b57c29fa56ab"} Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.183089 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fbbc88988-qj7hz" event={"ID":"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd","Type":"ContainerStarted","Data":"b8b5cfa9bbb5327165ea2c646588515a8b8cc86a25bafb6b356d4ef6b1f56af1"} Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.184921 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" event={"ID":"99aa72c9-fa66-42d9-bc96-f900f97bc81c","Type":"ContainerStarted","Data":"78b07f198474427e90a334d5847fbce65f890dbc6569e51e172d00a39105317c"} Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.186660 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d54fc98bf-7sqvm" Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.186837 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f5ff96685-clxfz" event={"ID":"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48","Type":"ContainerStarted","Data":"07b14c9260986cf764e8d148c8c5d69c1753aeb31b0ebe927b3a7112aaff40e8"} Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.200775 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55cb447c8f-m8jw2"] Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.210756 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55cb447c8f-m8jw2"] Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.274876 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d54fc98bf-7sqvm"] Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.287604 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d54fc98bf-7sqvm"] Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.521545 4546 scope.go:117] "RemoveContainer" containerID="ffcaf1e1ba2cc34592de70c4e17dd95cec56248aaa3690ae830c9e46caaf5127" Feb 01 06:58:14 crc kubenswrapper[4546]: I0201 06:58:14.992373 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.205966 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"514e1b1d-bc66-482c-8b2f-6786671ce639","Type":"ContainerStarted","Data":"9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551"} Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.211370 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" event={"ID":"95418b3b-b693-4b25-8ce8-967d233a1e54","Type":"ContainerStarted","Data":"c01ec8ed21578560d39dd4afe1e10dbffedb3b6af61d452e12535b725db11c5a"} Feb 01 06:58:15 crc 
kubenswrapper[4546]: I0201 06:58:15.213026 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.222230 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fbbc88988-qj7hz" event={"ID":"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd","Type":"ContainerStarted","Data":"ef63fcb83333d4554b5d22233f18fab3499ef67fdecb9a66a44a0fd3cc78e3b4"} Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.222251 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fbbc88988-qj7hz" event={"ID":"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd","Type":"ContainerStarted","Data":"6a1ed3909dc2ba2f0aab65e710c5ca7444f2519e140471d1df2c9537f4df77e3"} Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.223124 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.223150 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.261221 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" podStartSLOduration=3.261202441 podStartE2EDuration="3.261202441s" podCreationTimestamp="2026-02-01 06:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:15.237240821 +0000 UTC m=+925.888176837" watchObservedRunningTime="2026-02-01 06:58:15.261202441 +0000 UTC m=+925.912138457" Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.263973 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-fbbc88988-qj7hz" podStartSLOduration=3.26396545 podStartE2EDuration="3.26396545s" podCreationTimestamp="2026-02-01 06:58:12 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:15.249663971 +0000 UTC m=+925.900599976" watchObservedRunningTime="2026-02-01 06:58:15.26396545 +0000 UTC m=+925.914901456" Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.668453 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071babef-eb54-472c-8668-4f856d2d3438" path="/var/lib/kubelet/pods/071babef-eb54-472c-8668-4f856d2d3438/volumes" Feb 01 06:58:15 crc kubenswrapper[4546]: I0201 06:58:15.669166 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad13b31-fc9b-4e58-97f5-35f208029aad" path="/var/lib/kubelet/pods/3ad13b31-fc9b-4e58-97f5-35f208029aad/volumes" Feb 01 06:58:16 crc kubenswrapper[4546]: I0201 06:58:16.232998 4546 generic.go:334] "Generic (PLEG): container finished" podID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerID="4a5f15bd1d7835c016f46c2f196f9d2d2ae66c2104844c833cfa3d78a502e4a4" exitCode=0 Feb 01 06:58:16 crc kubenswrapper[4546]: I0201 06:58:16.233075 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerDied","Data":"4a5f15bd1d7835c016f46c2f196f9d2d2ae66c2104844c833cfa3d78a502e4a4"} Feb 01 06:58:16 crc kubenswrapper[4546]: I0201 06:58:16.239790 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"514e1b1d-bc66-482c-8b2f-6786671ce639","Type":"ContainerStarted","Data":"b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43"} Feb 01 06:58:16 crc kubenswrapper[4546]: I0201 06:58:16.240004 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerName="cinder-api-log" containerID="cri-o://9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551" gracePeriod=30 Feb 01 06:58:16 crc 
kubenswrapper[4546]: I0201 06:58:16.240124 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 01 06:58:16 crc kubenswrapper[4546]: I0201 06:58:16.240210 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerName="cinder-api" containerID="cri-o://b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43" gracePeriod=30 Feb 01 06:58:16 crc kubenswrapper[4546]: I0201 06:58:16.247493 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ae79a33-7ef9-4952-a754-e4a2ece9a771","Type":"ContainerStarted","Data":"c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f"} Feb 01 06:58:16 crc kubenswrapper[4546]: I0201 06:58:16.262934 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.262917627 podStartE2EDuration="4.262917627s" podCreationTimestamp="2026-02-01 06:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:16.254629834 +0000 UTC m=+926.905565850" watchObservedRunningTime="2026-02-01 06:58:16.262917627 +0000 UTC m=+926.913853643" Feb 01 06:58:16 crc kubenswrapper[4546]: I0201 06:58:16.696138 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:58:16 crc kubenswrapper[4546]: I0201 06:58:16.880505 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.021462 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-scripts\") pod \"514e1b1d-bc66-482c-8b2f-6786671ce639\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.021647 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data\") pod \"514e1b1d-bc66-482c-8b2f-6786671ce639\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.021739 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-combined-ca-bundle\") pod \"514e1b1d-bc66-482c-8b2f-6786671ce639\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.021825 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhjpn\" (UniqueName: \"kubernetes.io/projected/514e1b1d-bc66-482c-8b2f-6786671ce639-kube-api-access-xhjpn\") pod \"514e1b1d-bc66-482c-8b2f-6786671ce639\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.022346 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514e1b1d-bc66-482c-8b2f-6786671ce639-logs\") pod \"514e1b1d-bc66-482c-8b2f-6786671ce639\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.022396 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data-custom\") pod \"514e1b1d-bc66-482c-8b2f-6786671ce639\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.022433 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/514e1b1d-bc66-482c-8b2f-6786671ce639-etc-machine-id\") pod \"514e1b1d-bc66-482c-8b2f-6786671ce639\" (UID: \"514e1b1d-bc66-482c-8b2f-6786671ce639\") " Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.022700 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/514e1b1d-bc66-482c-8b2f-6786671ce639-logs" (OuterVolumeSpecName: "logs") pod "514e1b1d-bc66-482c-8b2f-6786671ce639" (UID: "514e1b1d-bc66-482c-8b2f-6786671ce639"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.022819 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/514e1b1d-bc66-482c-8b2f-6786671ce639-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "514e1b1d-bc66-482c-8b2f-6786671ce639" (UID: "514e1b1d-bc66-482c-8b2f-6786671ce639"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.023400 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/514e1b1d-bc66-482c-8b2f-6786671ce639-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.023498 4546 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/514e1b1d-bc66-482c-8b2f-6786671ce639-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.028791 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "514e1b1d-bc66-482c-8b2f-6786671ce639" (UID: "514e1b1d-bc66-482c-8b2f-6786671ce639"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.029053 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514e1b1d-bc66-482c-8b2f-6786671ce639-kube-api-access-xhjpn" (OuterVolumeSpecName: "kube-api-access-xhjpn") pod "514e1b1d-bc66-482c-8b2f-6786671ce639" (UID: "514e1b1d-bc66-482c-8b2f-6786671ce639"). InnerVolumeSpecName "kube-api-access-xhjpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.033320 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-scripts" (OuterVolumeSpecName: "scripts") pod "514e1b1d-bc66-482c-8b2f-6786671ce639" (UID: "514e1b1d-bc66-482c-8b2f-6786671ce639"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.048990 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "514e1b1d-bc66-482c-8b2f-6786671ce639" (UID: "514e1b1d-bc66-482c-8b2f-6786671ce639"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.078798 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data" (OuterVolumeSpecName: "config-data") pod "514e1b1d-bc66-482c-8b2f-6786671ce639" (UID: "514e1b1d-bc66-482c-8b2f-6786671ce639"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.127940 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.127970 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.127984 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhjpn\" (UniqueName: \"kubernetes.io/projected/514e1b1d-bc66-482c-8b2f-6786671ce639-kube-api-access-xhjpn\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.127994 4546 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-config-data-custom\") on node \"crc\" 
DevicePath \"\"" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.128003 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/514e1b1d-bc66-482c-8b2f-6786671ce639-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.268591 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ae79a33-7ef9-4952-a754-e4a2ece9a771","Type":"ContainerStarted","Data":"67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5"} Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.271834 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" event={"ID":"99aa72c9-fa66-42d9-bc96-f900f97bc81c","Type":"ContainerStarted","Data":"e4c28c082c5c97b67a5f6cca4b11eca0eb94bfc7c7af8a89e7d90566a0beda70"} Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.271915 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" event={"ID":"99aa72c9-fa66-42d9-bc96-f900f97bc81c","Type":"ContainerStarted","Data":"39fd2b7d271683ddb71b911cac6f44dfaa19bff3d52a95c12679d4695a07fdc4"} Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.273687 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f5ff96685-clxfz" event={"ID":"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48","Type":"ContainerStarted","Data":"3f2117262413cfe909099b17a2275a62e617d2d05ac699c73ac8cead3e6ad0b2"} Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.273721 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f5ff96685-clxfz" event={"ID":"2ade6ab0-0cf6-4ebf-aa62-ed547a877d48","Type":"ContainerStarted","Data":"7ae4127461cc0d6786e73e3fa2fe5a196dfc3b498a4fe5031d6d8d87f1168afc"} Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.275847 4546 generic.go:334] "Generic (PLEG): container finished" 
podID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerID="b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43" exitCode=0 Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.275898 4546 generic.go:334] "Generic (PLEG): container finished" podID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerID="9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551" exitCode=143 Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.275979 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.275960 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"514e1b1d-bc66-482c-8b2f-6786671ce639","Type":"ContainerDied","Data":"b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43"} Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.276066 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"514e1b1d-bc66-482c-8b2f-6786671ce639","Type":"ContainerDied","Data":"9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551"} Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.276082 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"514e1b1d-bc66-482c-8b2f-6786671ce639","Type":"ContainerDied","Data":"8fc67767fc9a402b07e0c33782afcd23bf6bf077efd2dc0a1a0e69091cde2ab6"} Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.276108 4546 scope.go:117] "RemoveContainer" containerID="b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.303153 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.997065591 podStartE2EDuration="6.303124994s" podCreationTimestamp="2026-02-01 06:58:11 +0000 UTC" firstStartedPulling="2026-02-01 06:58:13.279741574 +0000 UTC 
m=+923.930677590" lastFinishedPulling="2026-02-01 06:58:14.585800977 +0000 UTC m=+925.236736993" observedRunningTime="2026-02-01 06:58:17.297218851 +0000 UTC m=+927.948154867" watchObservedRunningTime="2026-02-01 06:58:17.303124994 +0000 UTC m=+927.954061011" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.307958 4546 scope.go:117] "RemoveContainer" containerID="9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.356359 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6f5ff96685-clxfz" podStartSLOduration=2.749058203 podStartE2EDuration="5.356343327s" podCreationTimestamp="2026-02-01 06:58:12 +0000 UTC" firstStartedPulling="2026-02-01 06:58:13.713835251 +0000 UTC m=+924.364771257" lastFinishedPulling="2026-02-01 06:58:16.321120365 +0000 UTC m=+926.972056381" observedRunningTime="2026-02-01 06:58:17.334987408 +0000 UTC m=+927.985923425" watchObservedRunningTime="2026-02-01 06:58:17.356343327 +0000 UTC m=+928.007279342" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.362064 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.365177 4546 scope.go:117] "RemoveContainer" containerID="b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43" Feb 01 06:58:17 crc kubenswrapper[4546]: E0201 06:58:17.367564 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43\": container with ID starting with b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43 not found: ID does not exist" containerID="b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.367603 4546 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43"} err="failed to get container status \"b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43\": rpc error: code = NotFound desc = could not find container \"b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43\": container with ID starting with b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43 not found: ID does not exist" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.367623 4546 scope.go:117] "RemoveContainer" containerID="9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.373160 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6cbd87f6bb-p2x5b" podStartSLOduration=2.572252096 podStartE2EDuration="5.373148175s" podCreationTimestamp="2026-02-01 06:58:12 +0000 UTC" firstStartedPulling="2026-02-01 06:58:13.522579665 +0000 UTC m=+924.173515682" lastFinishedPulling="2026-02-01 06:58:16.323475745 +0000 UTC m=+926.974411761" observedRunningTime="2026-02-01 06:58:17.3571517 +0000 UTC m=+928.008087717" watchObservedRunningTime="2026-02-01 06:58:17.373148175 +0000 UTC m=+928.024084180" Feb 01 06:58:17 crc kubenswrapper[4546]: E0201 06:58:17.378055 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551\": container with ID starting with 9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551 not found: ID does not exist" containerID="9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.378097 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551"} 
err="failed to get container status \"9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551\": rpc error: code = NotFound desc = could not find container \"9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551\": container with ID starting with 9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551 not found: ID does not exist" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.378123 4546 scope.go:117] "RemoveContainer" containerID="b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.382019 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43"} err="failed to get container status \"b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43\": rpc error: code = NotFound desc = could not find container \"b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43\": container with ID starting with b95980249656360b7a4e0831e92be155d8d6f4f199f24c0a0c15577763809c43 not found: ID does not exist" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.382073 4546 scope.go:117] "RemoveContainer" containerID="9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.383910 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551"} err="failed to get container status \"9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551\": rpc error: code = NotFound desc = could not find container \"9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551\": container with ID starting with 9c955f304c9de333092242af20aad1e6991a3e607615255842f78b06c712e551 not found: ID does not exist" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.408778 4546 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.424900 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.449580 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 01 06:58:17 crc kubenswrapper[4546]: E0201 06:58:17.450199 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerName="horizon-log" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.450220 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerName="horizon-log" Feb 01 06:58:17 crc kubenswrapper[4546]: E0201 06:58:17.450243 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerName="cinder-api" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.450251 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerName="cinder-api" Feb 01 06:58:17 crc kubenswrapper[4546]: E0201 06:58:17.450265 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerName="cinder-api-log" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.450272 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerName="cinder-api-log" Feb 01 06:58:17 crc kubenswrapper[4546]: E0201 06:58:17.450282 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerName="horizon" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.450288 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerName="horizon" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.450480 4546 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerName="cinder-api" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.450499 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerName="horizon" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.450509 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="514e1b1d-bc66-482c-8b2f-6786671ce639" containerName="cinder-api-log" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.450518 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad13b31-fc9b-4e58-97f5-35f208029aad" containerName="horizon-log" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.451732 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.454030 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.454174 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.454344 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.458622 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.547283 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.547608 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtd2x\" (UniqueName: \"kubernetes.io/projected/0b22c05c-eab7-40a4-bdd9-3c253897979d-kube-api-access-xtd2x\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.547642 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-config-data\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.547933 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-scripts\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.548107 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b22c05c-eab7-40a4-bdd9-3c253897979d-logs\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.548154 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.548302 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.548341 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b22c05c-eab7-40a4-bdd9-3c253897979d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.548386 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.651444 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.651499 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtd2x\" (UniqueName: \"kubernetes.io/projected/0b22c05c-eab7-40a4-bdd9-3c253897979d-kube-api-access-xtd2x\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.651547 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-config-data\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " 
pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.651644 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-scripts\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.651716 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b22c05c-eab7-40a4-bdd9-3c253897979d-logs\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.651750 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.651828 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.651871 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b22c05c-eab7-40a4-bdd9-3c253897979d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.651940 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.652213 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b22c05c-eab7-40a4-bdd9-3c253897979d-logs\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.652420 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0b22c05c-eab7-40a4-bdd9-3c253897979d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.661084 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-config-data\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.662182 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.671362 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.671387 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-scripts\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.671896 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-config-data-custom\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.672323 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b22c05c-eab7-40a4-bdd9-3c253897979d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.677024 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtd2x\" (UniqueName: \"kubernetes.io/projected/0b22c05c-eab7-40a4-bdd9-3c253897979d-kube-api-access-xtd2x\") pod \"cinder-api-0\" (UID: \"0b22c05c-eab7-40a4-bdd9-3c253897979d\") " pod="openstack/cinder-api-0" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.678236 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514e1b1d-bc66-482c-8b2f-6786671ce639" path="/var/lib/kubelet/pods/514e1b1d-bc66-482c-8b2f-6786671ce639/volumes" Feb 01 06:58:17 crc kubenswrapper[4546]: I0201 06:58:17.792566 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 06:58:18 crc kubenswrapper[4546]: I0201 06:58:18.274628 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 01 06:58:18 crc kubenswrapper[4546]: I0201 06:58:18.304316 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b22c05c-eab7-40a4-bdd9-3c253897979d","Type":"ContainerStarted","Data":"92f92c9063af8ecbeea00edc6eb543e40bd3fc9b2c688f6055eb6cbb566ec0b6"} Feb 01 06:58:18 crc kubenswrapper[4546]: I0201 06:58:18.686147 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5867f5bb44-shmxj" Feb 01 06:58:18 crc kubenswrapper[4546]: I0201 06:58:18.764489 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c8bd8cd6b-vfjlr"] Feb 01 06:58:18 crc kubenswrapper[4546]: I0201 06:58:18.904793 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58cd8b848-kmr5k"] Feb 01 06:58:18 crc kubenswrapper[4546]: I0201 06:58:18.932365 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58cd8b848-kmr5k"] Feb 01 06:58:18 crc kubenswrapper[4546]: I0201 06:58:18.932468 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:18 crc kubenswrapper[4546]: I0201 06:58:18.934681 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 01 06:58:18 crc kubenswrapper[4546]: I0201 06:58:18.934732 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.109164 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cab73c0-d0e4-4a59-a80b-338d4873300b-logs\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.109216 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-public-tls-certs\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.109271 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-combined-ca-bundle\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.109286 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-internal-tls-certs\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " 
pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.109320 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-config-data-custom\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.109344 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-config-data\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.109389 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twj8n\" (UniqueName: \"kubernetes.io/projected/3cab73c0-d0e4-4a59-a80b-338d4873300b-kube-api-access-twj8n\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.210917 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twj8n\" (UniqueName: \"kubernetes.io/projected/3cab73c0-d0e4-4a59-a80b-338d4873300b-kube-api-access-twj8n\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.211261 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cab73c0-d0e4-4a59-a80b-338d4873300b-logs\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: 
\"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.211304 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-public-tls-certs\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.211394 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-combined-ca-bundle\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.211417 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-internal-tls-certs\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.211469 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-config-data-custom\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.211512 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-config-data\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " 
pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.214559 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cab73c0-d0e4-4a59-a80b-338d4873300b-logs\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.220938 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-internal-tls-certs\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.221040 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-combined-ca-bundle\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.221660 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-public-tls-certs\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.225526 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-config-data-custom\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 
06:58:19.223381 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cab73c0-d0e4-4a59-a80b-338d4873300b-config-data\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.260024 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twj8n\" (UniqueName: \"kubernetes.io/projected/3cab73c0-d0e4-4a59-a80b-338d4873300b-kube-api-access-twj8n\") pod \"barbican-api-58cd8b848-kmr5k\" (UID: \"3cab73c0-d0e4-4a59-a80b-338d4873300b\") " pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.316503 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b22c05c-eab7-40a4-bdd9-3c253897979d","Type":"ContainerStarted","Data":"1b23ea12d6c87e8a5495cd02f58bc69735a64a6eec7ab6d997e989b66aef7383"} Feb 01 06:58:19 crc kubenswrapper[4546]: I0201 06:58:19.556772 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.018161 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.073624 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.173384 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58cd8b848-kmr5k"] Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.362922 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0b22c05c-eab7-40a4-bdd9-3c253897979d","Type":"ContainerStarted","Data":"a86dc5ac04fc4314b6748ed86c3172914f19b895508762a6947c17b6b1c527a4"} Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.366275 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.370899 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-868f7bb468-rfpkj"] Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.372577 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.381877 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58cd8b848-kmr5k" event={"ID":"3cab73c0-d0e4-4a59-a80b-338d4873300b","Type":"ContainerStarted","Data":"675d70fd754514da70ed01c8e9bed58fd9b095277af6a2146b9bc13961700556"} Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.382623 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-868f7bb468-rfpkj"] Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.404774 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.404760496 podStartE2EDuration="3.404760496s" podCreationTimestamp="2026-02-01 06:58:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:20.398837891 +0000 UTC m=+931.049773897" watchObservedRunningTime="2026-02-01 06:58:20.404760496 +0000 UTC m=+931.055696512" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.460528 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-public-tls-certs\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.461089 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f275408-af43-4eb5-b5e7-9df52288f7ad-logs\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.461666 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-combined-ca-bundle\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.461907 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/9f275408-af43-4eb5-b5e7-9df52288f7ad-kube-api-access-6fkmx\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.462105 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-scripts\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.462541 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-config-data\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.462649 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-internal-tls-certs\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.565438 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-config-data\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.566269 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-internal-tls-certs\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.566422 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-public-tls-certs\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.567607 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f275408-af43-4eb5-b5e7-9df52288f7ad-logs\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.567746 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-combined-ca-bundle\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.567883 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/9f275408-af43-4eb5-b5e7-9df52288f7ad-kube-api-access-6fkmx\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.567989 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-scripts\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.568981 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f275408-af43-4eb5-b5e7-9df52288f7ad-logs\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.573458 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-config-data\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.575768 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-scripts\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.580356 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-internal-tls-certs\") pod \"placement-868f7bb468-rfpkj\" 
(UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.585566 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-public-tls-certs\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.586109 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f275408-af43-4eb5-b5e7-9df52288f7ad-combined-ca-bundle\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.591525 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/9f275408-af43-4eb5-b5e7-9df52288f7ad-kube-api-access-6fkmx\") pod \"placement-868f7bb468-rfpkj\" (UID: \"9f275408-af43-4eb5-b5e7-9df52288f7ad\") " pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:20 crc kubenswrapper[4546]: I0201 06:58:20.704179 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:21 crc kubenswrapper[4546]: I0201 06:58:21.160739 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-868f7bb468-rfpkj"] Feb 01 06:58:21 crc kubenswrapper[4546]: I0201 06:58:21.390394 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-868f7bb468-rfpkj" event={"ID":"9f275408-af43-4eb5-b5e7-9df52288f7ad","Type":"ContainerStarted","Data":"014a494269a5a5cca24237fa1ad4e63f3ffc1118eb09c9441d70aee1a8862352"} Feb 01 06:58:21 crc kubenswrapper[4546]: I0201 06:58:21.393267 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58cd8b848-kmr5k" event={"ID":"3cab73c0-d0e4-4a59-a80b-338d4873300b","Type":"ContainerStarted","Data":"d2549394a40e57bf6adc7b95caacda2fcc5fce49b219d4d85ebec2acd6358281"} Feb 01 06:58:21 crc kubenswrapper[4546]: I0201 06:58:21.393309 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58cd8b848-kmr5k" event={"ID":"3cab73c0-d0e4-4a59-a80b-338d4873300b","Type":"ContainerStarted","Data":"6c0f678e91b816240c08729059e2140d040d6157dccba1071034b86876b42f78"} Feb 01 06:58:21 crc kubenswrapper[4546]: I0201 06:58:21.393424 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:21 crc kubenswrapper[4546]: I0201 06:58:21.393443 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:21 crc kubenswrapper[4546]: I0201 06:58:21.420453 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58cd8b848-kmr5k" podStartSLOduration=3.420434134 podStartE2EDuration="3.420434134s" podCreationTimestamp="2026-02-01 06:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:21.413815118 +0000 UTC 
m=+932.064751124" watchObservedRunningTime="2026-02-01 06:58:21.420434134 +0000 UTC m=+932.071370151" Feb 01 06:58:21 crc kubenswrapper[4546]: I0201 06:58:21.914456 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.173744 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-675d5f5fd9-ptdjf" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.228124 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bbbc47dc7-979jx"] Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.228357 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bbbc47dc7-979jx" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-api" containerID="cri-o://34f1d49f70e8071b576af6dd3ea1f0d6c2447d06e3d60ee0507b0b60fad1400d" gracePeriod=30 Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.228503 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bbbc47dc7-979jx" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-httpd" containerID="cri-o://1dac26e923ee4b658c5bf75a6ffa320740158613f7cf7c7525307a3083e6a354" gracePeriod=30 Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.253130 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.281621 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76f65868d9-5zt7q"] Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.282893 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.313971 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76f65868d9-5zt7q"] Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.404579 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-868f7bb468-rfpkj" event={"ID":"9f275408-af43-4eb5-b5e7-9df52288f7ad","Type":"ContainerStarted","Data":"7373c9c936db5696ca0e72c3af2949c53ba8df0487224d498f467692eff5b6db"} Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.404619 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-868f7bb468-rfpkj" event={"ID":"9f275408-af43-4eb5-b5e7-9df52288f7ad","Type":"ContainerStarted","Data":"9819ae93bf930048596eb4dcd0df97e67749b5726585e4955e047e554e8e07ac"} Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.405059 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.405080 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.407758 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-httpd-config\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.407786 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-combined-ca-bundle\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" 
Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.407810 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-config\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.407837 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-ovndb-tls-certs\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.407884 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-internal-tls-certs\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.407904 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkzsh\" (UniqueName: \"kubernetes.io/projected/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-kube-api-access-lkzsh\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.408245 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-public-tls-certs\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" 
Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.423485 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-868f7bb468-rfpkj" podStartSLOduration=2.4234754 podStartE2EDuration="2.4234754s" podCreationTimestamp="2026-02-01 06:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:22.418900376 +0000 UTC m=+933.069836392" watchObservedRunningTime="2026-02-01 06:58:22.4234754 +0000 UTC m=+933.074411417" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.510186 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-httpd-config\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.510239 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-combined-ca-bundle\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.510293 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-config\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.510837 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-ovndb-tls-certs\") pod \"neutron-76f65868d9-5zt7q\" (UID: 
\"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.511228 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-internal-tls-certs\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.512057 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkzsh\" (UniqueName: \"kubernetes.io/projected/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-kube-api-access-lkzsh\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.515257 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-public-tls-certs\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.516364 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-internal-tls-certs\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.516423 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-ovndb-tls-certs\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" 
Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.516800 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-combined-ca-bundle\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.520266 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-public-tls-certs\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.521379 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-httpd-config\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.528790 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-config\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.531468 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkzsh\" (UniqueName: \"kubernetes.io/projected/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-kube-api-access-lkzsh\") pod \"neutron-76f65868d9-5zt7q\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.591626 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.605544 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:22 crc kubenswrapper[4546]: I0201 06:58:22.677105 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.113037 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.185609 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d668c6fc7-hbl8c"] Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.185883 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" podUID="b364bd0d-fc72-4625-aba3-67afb7c32703" containerName="dnsmasq-dns" containerID="cri-o://00c11f97b4794948dfcc9be71ace0f59d2478f36bb3b9a3006ad675564daddac" gracePeriod=10 Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.381547 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76f65868d9-5zt7q"] Feb 01 06:58:23 crc kubenswrapper[4546]: W0201 06:58:23.402256 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c4ef4a8_3d36_4231_a2e8_06f510d1f3e6.slice/crio-fa42d5e7be806b1baf4c7998c69f5d97904289206e922760eec79dfc2ce913ad WatchSource:0}: Error finding container fa42d5e7be806b1baf4c7998c69f5d97904289206e922760eec79dfc2ce913ad: Status 404 returned error can't find the container with id fa42d5e7be806b1baf4c7998c69f5d97904289206e922760eec79dfc2ce913ad Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.424294 4546 generic.go:334] "Generic (PLEG): container finished" podID="64ac113d-2149-47d8-8a13-a864cdeff3ee" 
containerID="1dac26e923ee4b658c5bf75a6ffa320740158613f7cf7c7525307a3083e6a354" exitCode=0 Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.424366 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbbc47dc7-979jx" event={"ID":"64ac113d-2149-47d8-8a13-a864cdeff3ee","Type":"ContainerDied","Data":"1dac26e923ee4b658c5bf75a6ffa320740158613f7cf7c7525307a3083e6a354"} Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.427954 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f65868d9-5zt7q" event={"ID":"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6","Type":"ContainerStarted","Data":"fa42d5e7be806b1baf4c7998c69f5d97904289206e922760eec79dfc2ce913ad"} Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.436576 4546 generic.go:334] "Generic (PLEG): container finished" podID="b364bd0d-fc72-4625-aba3-67afb7c32703" containerID="00c11f97b4794948dfcc9be71ace0f59d2478f36bb3b9a3006ad675564daddac" exitCode=0 Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.436622 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" event={"ID":"b364bd0d-fc72-4625-aba3-67afb7c32703","Type":"ContainerDied","Data":"00c11f97b4794948dfcc9be71ace0f59d2478f36bb3b9a3006ad675564daddac"} Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.437182 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerName="cinder-scheduler" containerID="cri-o://c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f" gracePeriod=30 Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.437780 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerName="probe" containerID="cri-o://67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5" gracePeriod=30 Feb 01 06:58:23 crc 
kubenswrapper[4546]: I0201 06:58:23.547460 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6bbbc47dc7-979jx" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.746469 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.754163 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-sb\") pod \"b364bd0d-fc72-4625-aba3-67afb7c32703\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.754205 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-nb\") pod \"b364bd0d-fc72-4625-aba3-67afb7c32703\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.754279 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-swift-storage-0\") pod \"b364bd0d-fc72-4625-aba3-67afb7c32703\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.754318 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-svc\") pod \"b364bd0d-fc72-4625-aba3-67afb7c32703\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 
06:58:23.754358 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxm7s\" (UniqueName: \"kubernetes.io/projected/b364bd0d-fc72-4625-aba3-67afb7c32703-kube-api-access-jxm7s\") pod \"b364bd0d-fc72-4625-aba3-67afb7c32703\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.754449 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-config\") pod \"b364bd0d-fc72-4625-aba3-67afb7c32703\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.800020 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b364bd0d-fc72-4625-aba3-67afb7c32703-kube-api-access-jxm7s" (OuterVolumeSpecName: "kube-api-access-jxm7s") pod "b364bd0d-fc72-4625-aba3-67afb7c32703" (UID: "b364bd0d-fc72-4625-aba3-67afb7c32703"). InnerVolumeSpecName "kube-api-access-jxm7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.856426 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b364bd0d-fc72-4625-aba3-67afb7c32703" (UID: "b364bd0d-fc72-4625-aba3-67afb7c32703"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.859396 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b364bd0d-fc72-4625-aba3-67afb7c32703" (UID: "b364bd0d-fc72-4625-aba3-67afb7c32703"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.860317 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b364bd0d-fc72-4625-aba3-67afb7c32703" (UID: "b364bd0d-fc72-4625-aba3-67afb7c32703"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.865241 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b364bd0d-fc72-4625-aba3-67afb7c32703" (UID: "b364bd0d-fc72-4625-aba3-67afb7c32703"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.869113 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-nb\") pod \"b364bd0d-fc72-4625-aba3-67afb7c32703\" (UID: \"b364bd0d-fc72-4625-aba3-67afb7c32703\") " Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.870399 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.870485 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.870572 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.870652 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxm7s\" (UniqueName: \"kubernetes.io/projected/b364bd0d-fc72-4625-aba3-67afb7c32703-kube-api-access-jxm7s\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:23 crc kubenswrapper[4546]: W0201 06:58:23.870783 4546 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b364bd0d-fc72-4625-aba3-67afb7c32703/volumes/kubernetes.io~configmap/ovsdbserver-nb Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.870886 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b364bd0d-fc72-4625-aba3-67afb7c32703" (UID: "b364bd0d-fc72-4625-aba3-67afb7c32703"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.887238 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-config" (OuterVolumeSpecName: "config") pod "b364bd0d-fc72-4625-aba3-67afb7c32703" (UID: "b364bd0d-fc72-4625-aba3-67afb7c32703"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.973089 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:23 crc kubenswrapper[4546]: I0201 06:58:23.973221 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b364bd0d-fc72-4625-aba3-67afb7c32703-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.483699 4546 generic.go:334] "Generic (PLEG): container finished" podID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerID="67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5" exitCode=0 Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.484144 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ae79a33-7ef9-4952-a754-e4a2ece9a771","Type":"ContainerDied","Data":"67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5"} Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.487794 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f65868d9-5zt7q" event={"ID":"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6","Type":"ContainerStarted","Data":"107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6"} Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.487843 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f65868d9-5zt7q" event={"ID":"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6","Type":"ContainerStarted","Data":"5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe"} Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.488635 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.492809 4546 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.492965 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d668c6fc7-hbl8c" event={"ID":"b364bd0d-fc72-4625-aba3-67afb7c32703","Type":"ContainerDied","Data":"bb8d8edaf2bc30360d63def4a52318e90c684a147c38ca35e5865b8e619fe381"} Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.493109 4546 scope.go:117] "RemoveContainer" containerID="00c11f97b4794948dfcc9be71ace0f59d2478f36bb3b9a3006ad675564daddac" Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.542683 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76f65868d9-5zt7q" podStartSLOduration=2.542657705 podStartE2EDuration="2.542657705s" podCreationTimestamp="2026-02-01 06:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:24.512509073 +0000 UTC m=+935.163445089" watchObservedRunningTime="2026-02-01 06:58:24.542657705 +0000 UTC m=+935.193593721" Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.551146 4546 scope.go:117] "RemoveContainer" containerID="b17c73d0b08382beb80f681a74a66406b90fb7830c92248557b2cf2134336c4d" Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.568110 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d668c6fc7-hbl8c"] Feb 01 06:58:24 crc kubenswrapper[4546]: I0201 06:58:24.569472 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d668c6fc7-hbl8c"] Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.256896 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.321602 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data-custom\") pod \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.321689 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-combined-ca-bundle\") pod \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.321884 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhfc8\" (UniqueName: \"kubernetes.io/projected/3ae79a33-7ef9-4952-a754-e4a2ece9a771-kube-api-access-qhfc8\") pod \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.321912 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-scripts\") pod \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.321952 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ae79a33-7ef9-4952-a754-e4a2ece9a771-etc-machine-id\") pod \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.322165 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data\") pod \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\" (UID: \"3ae79a33-7ef9-4952-a754-e4a2ece9a771\") " Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.322651 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ae79a33-7ef9-4952-a754-e4a2ece9a771-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3ae79a33-7ef9-4952-a754-e4a2ece9a771" (UID: "3ae79a33-7ef9-4952-a754-e4a2ece9a771"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.347073 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae79a33-7ef9-4952-a754-e4a2ece9a771-kube-api-access-qhfc8" (OuterVolumeSpecName: "kube-api-access-qhfc8") pod "3ae79a33-7ef9-4952-a754-e4a2ece9a771" (UID: "3ae79a33-7ef9-4952-a754-e4a2ece9a771"). InnerVolumeSpecName "kube-api-access-qhfc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.356006 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-scripts" (OuterVolumeSpecName: "scripts") pod "3ae79a33-7ef9-4952-a754-e4a2ece9a771" (UID: "3ae79a33-7ef9-4952-a754-e4a2ece9a771"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.376839 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3ae79a33-7ef9-4952-a754-e4a2ece9a771" (UID: "3ae79a33-7ef9-4952-a754-e4a2ece9a771"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.414442 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ae79a33-7ef9-4952-a754-e4a2ece9a771" (UID: "3ae79a33-7ef9-4952-a754-e4a2ece9a771"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.420333 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.420438 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.425758 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhfc8\" (UniqueName: \"kubernetes.io/projected/3ae79a33-7ef9-4952-a754-e4a2ece9a771-kube-api-access-qhfc8\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.425781 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.425790 4546 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3ae79a33-7ef9-4952-a754-e4a2ece9a771-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.425798 4546 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.425808 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.505083 4546 generic.go:334] "Generic (PLEG): container finished" podID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerID="c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f" exitCode=0 Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.505161 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.505164 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ae79a33-7ef9-4952-a754-e4a2ece9a771","Type":"ContainerDied","Data":"c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f"} Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.506085 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ae79a33-7ef9-4952-a754-e4a2ece9a771","Type":"ContainerDied","Data":"d6192ee938ca066eeee6cd52764623f66b501e7ca1089c2964f8b57c29fa56ab"} Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.506137 4546 scope.go:117] "RemoveContainer" containerID="67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.519985 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data" (OuterVolumeSpecName: "config-data") pod "3ae79a33-7ef9-4952-a754-e4a2ece9a771" (UID: "3ae79a33-7ef9-4952-a754-e4a2ece9a771"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.530071 4546 scope.go:117] "RemoveContainer" containerID="c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.531063 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ae79a33-7ef9-4952-a754-e4a2ece9a771-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.585784 4546 scope.go:117] "RemoveContainer" containerID="67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5" Feb 01 06:58:25 crc kubenswrapper[4546]: E0201 06:58:25.588984 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5\": container with ID starting with 67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5 not found: ID does not exist" containerID="67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.589114 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5"} err="failed to get container status \"67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5\": rpc error: code = NotFound desc = could not find container \"67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5\": container with ID starting with 67e518afef0d42249895d07c49d1dd3adb4850142280a2e396d3b2a3baf88bf5 not found: ID does not exist" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.589215 4546 scope.go:117] "RemoveContainer" containerID="c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f" Feb 01 06:58:25 crc kubenswrapper[4546]: E0201 06:58:25.589484 4546 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f\": container with ID starting with c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f not found: ID does not exist" containerID="c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.589574 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f"} err="failed to get container status \"c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f\": rpc error: code = NotFound desc = could not find container \"c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f\": container with ID starting with c81afc503c141686e89b7017aaaaf5e23e8491c28855d7d5bb24408fc45a540f not found: ID does not exist" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.664128 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b364bd0d-fc72-4625-aba3-67afb7c32703" path="/var/lib/kubelet/pods/b364bd0d-fc72-4625-aba3-67afb7c32703/volumes" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.839072 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.857964 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.863964 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 06:58:25 crc kubenswrapper[4546]: E0201 06:58:25.864322 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerName="cinder-scheduler" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.864335 4546 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerName="cinder-scheduler" Feb 01 06:58:25 crc kubenswrapper[4546]: E0201 06:58:25.864343 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b364bd0d-fc72-4625-aba3-67afb7c32703" containerName="dnsmasq-dns" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.864349 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="b364bd0d-fc72-4625-aba3-67afb7c32703" containerName="dnsmasq-dns" Feb 01 06:58:25 crc kubenswrapper[4546]: E0201 06:58:25.864370 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerName="probe" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.864377 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerName="probe" Feb 01 06:58:25 crc kubenswrapper[4546]: E0201 06:58:25.864391 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b364bd0d-fc72-4625-aba3-67afb7c32703" containerName="init" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.864396 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="b364bd0d-fc72-4625-aba3-67afb7c32703" containerName="init" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.864548 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="b364bd0d-fc72-4625-aba3-67afb7c32703" containerName="dnsmasq-dns" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.864564 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerName="probe" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.864572 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" containerName="cinder-scheduler" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.865490 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.870448 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.881438 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.941754 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.941970 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgzb2\" (UniqueName: \"kubernetes.io/projected/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-kube-api-access-lgzb2\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.942096 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.942196 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-config-data\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 
06:58:25.942284 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:25 crc kubenswrapper[4546]: I0201 06:58:25.942421 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-scripts\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.043760 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgzb2\" (UniqueName: \"kubernetes.io/projected/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-kube-api-access-lgzb2\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.043966 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.044057 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-config-data\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.044153 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.044298 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-scripts\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.044430 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.045188 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.051788 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.052361 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " 
pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.052832 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-config-data\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.058162 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-scripts\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.066912 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgzb2\" (UniqueName: \"kubernetes.io/projected/f95fa4f2-04ac-4988-a1ed-2f4f0d760b44-kube-api-access-lgzb2\") pod \"cinder-scheduler-0\" (UID: \"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44\") " pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.200851 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 06:58:26 crc kubenswrapper[4546]: I0201 06:58:26.839921 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.178603 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.240917 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.242217 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.259661 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.263271 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.263544 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-qjr5w" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.263803 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.380430 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64feee97-62ee-4dd2-a584-3bad4c95165e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.380512 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64feee97-62ee-4dd2-a584-3bad4c95165e-openstack-config-secret\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.380557 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvk5f\" (UniqueName: \"kubernetes.io/projected/64feee97-62ee-4dd2-a584-3bad4c95165e-kube-api-access-rvk5f\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.380642 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64feee97-62ee-4dd2-a584-3bad4c95165e-openstack-config\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.446751 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.483223 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64feee97-62ee-4dd2-a584-3bad4c95165e-openstack-config\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.483427 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64feee97-62ee-4dd2-a584-3bad4c95165e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.483500 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64feee97-62ee-4dd2-a584-3bad4c95165e-openstack-config-secret\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.483555 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvk5f\" (UniqueName: \"kubernetes.io/projected/64feee97-62ee-4dd2-a584-3bad4c95165e-kube-api-access-rvk5f\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 
06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.484731 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64feee97-62ee-4dd2-a584-3bad4c95165e-openstack-config\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.498652 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64feee97-62ee-4dd2-a584-3bad4c95165e-openstack-config-secret\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.515336 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64feee97-62ee-4dd2-a584-3bad4c95165e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.517356 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvk5f\" (UniqueName: \"kubernetes.io/projected/64feee97-62ee-4dd2-a584-3bad4c95165e-kube-api-access-rvk5f\") pod \"openstackclient\" (UID: \"64feee97-62ee-4dd2-a584-3bad4c95165e\") " pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.603830 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44","Type":"ContainerStarted","Data":"72ac058a22f8411197bc4ce3f1ee542f88ad4a346a2c5c673f79475bfe98a774"} Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.604362 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 01 06:58:27 crc kubenswrapper[4546]: I0201 06:58:27.679244 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae79a33-7ef9-4952-a754-e4a2ece9a771" path="/var/lib/kubelet/pods/3ae79a33-7ef9-4952-a754-e4a2ece9a771/volumes" Feb 01 06:58:28 crc kubenswrapper[4546]: I0201 06:58:28.173219 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 01 06:58:28 crc kubenswrapper[4546]: I0201 06:58:28.616069 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"64feee97-62ee-4dd2-a584-3bad4c95165e","Type":"ContainerStarted","Data":"83e45e15a805699e4005e6fee686e431b716c5750b85d79d50770d4ecf137c16"} Feb 01 06:58:28 crc kubenswrapper[4546]: I0201 06:58:28.618213 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44","Type":"ContainerStarted","Data":"b286db4a1828888cae7530f200ba631391dbc61b0708a375a7b0171e126c5f1e"} Feb 01 06:58:28 crc kubenswrapper[4546]: I0201 06:58:28.618389 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f95fa4f2-04ac-4988-a1ed-2f4f0d760b44","Type":"ContainerStarted","Data":"abdb97ac2067976df254433095877e7a3f64f8979ffa65920921af5414671d49"} Feb 01 06:58:28 crc kubenswrapper[4546]: I0201 06:58:28.638517 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.638481614 podStartE2EDuration="3.638481614s" podCreationTimestamp="2026-02-01 06:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:28.635746027 +0000 UTC m=+939.286682043" watchObservedRunningTime="2026-02-01 06:58:28.638481614 +0000 UTC m=+939.289417630" Feb 01 06:58:31 crc kubenswrapper[4546]: I0201 06:58:31.105572 4546 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 01 06:58:31 crc kubenswrapper[4546]: I0201 06:58:31.201944 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 01 06:58:31 crc kubenswrapper[4546]: I0201 06:58:31.690362 4546 generic.go:334] "Generic (PLEG): container finished" podID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerID="34f1d49f70e8071b576af6dd3ea1f0d6c2447d06e3d60ee0507b0b60fad1400d" exitCode=0 Feb 01 06:58:31 crc kubenswrapper[4546]: I0201 06:58:31.690415 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbbc47dc7-979jx" event={"ID":"64ac113d-2149-47d8-8a13-a864cdeff3ee","Type":"ContainerDied","Data":"34f1d49f70e8071b576af6dd3ea1f0d6c2447d06e3d60ee0507b0b60fad1400d"} Feb 01 06:58:31 crc kubenswrapper[4546]: I0201 06:58:31.798122 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="0b22c05c-eab7-40a4-bdd9-3c253897979d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.170:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.094229 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.138611 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-httpd-config\") pod \"64ac113d-2149-47d8-8a13-a864cdeff3ee\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.138749 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-ovndb-tls-certs\") pod \"64ac113d-2149-47d8-8a13-a864cdeff3ee\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.138788 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-config\") pod \"64ac113d-2149-47d8-8a13-a864cdeff3ee\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.138825 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-internal-tls-certs\") pod \"64ac113d-2149-47d8-8a13-a864cdeff3ee\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.139013 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-public-tls-certs\") pod \"64ac113d-2149-47d8-8a13-a864cdeff3ee\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.139178 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v66v\" (UniqueName: 
\"kubernetes.io/projected/64ac113d-2149-47d8-8a13-a864cdeff3ee-kube-api-access-9v66v\") pod \"64ac113d-2149-47d8-8a13-a864cdeff3ee\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.139203 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-combined-ca-bundle\") pod \"64ac113d-2149-47d8-8a13-a864cdeff3ee\" (UID: \"64ac113d-2149-47d8-8a13-a864cdeff3ee\") " Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.188944 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ac113d-2149-47d8-8a13-a864cdeff3ee-kube-api-access-9v66v" (OuterVolumeSpecName: "kube-api-access-9v66v") pod "64ac113d-2149-47d8-8a13-a864cdeff3ee" (UID: "64ac113d-2149-47d8-8a13-a864cdeff3ee"). InnerVolumeSpecName "kube-api-access-9v66v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.192061 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "64ac113d-2149-47d8-8a13-a864cdeff3ee" (UID: "64ac113d-2149-47d8-8a13-a864cdeff3ee"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.241986 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v66v\" (UniqueName: \"kubernetes.io/projected/64ac113d-2149-47d8-8a13-a864cdeff3ee-kube-api-access-9v66v\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.242015 4546 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.280510 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "64ac113d-2149-47d8-8a13-a864cdeff3ee" (UID: "64ac113d-2149-47d8-8a13-a864cdeff3ee"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.289967 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-config" (OuterVolumeSpecName: "config") pod "64ac113d-2149-47d8-8a13-a864cdeff3ee" (UID: "64ac113d-2149-47d8-8a13-a864cdeff3ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.291389 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64ac113d-2149-47d8-8a13-a864cdeff3ee" (UID: "64ac113d-2149-47d8-8a13-a864cdeff3ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.292964 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "64ac113d-2149-47d8-8a13-a864cdeff3ee" (UID: "64ac113d-2149-47d8-8a13-a864cdeff3ee"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.297046 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "64ac113d-2149-47d8-8a13-a864cdeff3ee" (UID: "64ac113d-2149-47d8-8a13-a864cdeff3ee"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.344053 4546 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.344082 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.344092 4546 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.344102 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-config\") on node \"crc\" DevicePath \"\"" Feb 01 
06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.344112 4546 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ac113d-2149-47d8-8a13-a864cdeff3ee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.706551 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbbc47dc7-979jx" event={"ID":"64ac113d-2149-47d8-8a13-a864cdeff3ee","Type":"ContainerDied","Data":"51701b808b3db4b9686dda0998722c5f4919552455165cc265ae46a1ef4b3692"} Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.706604 4546 scope.go:117] "RemoveContainer" containerID="1dac26e923ee4b658c5bf75a6ffa320740158613f7cf7c7525307a3083e6a354" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.706727 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bbbc47dc7-979jx" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.742624 4546 scope.go:117] "RemoveContainer" containerID="34f1d49f70e8071b576af6dd3ea1f0d6c2447d06e3d60ee0507b0b60fad1400d" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.746249 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bbbc47dc7-979jx"] Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.753054 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bbbc47dc7-979jx"] Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.798994 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="0b22c05c-eab7-40a4-bdd9-3c253897979d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.170:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 06:58:32 crc kubenswrapper[4546]: I0201 06:58:32.885968 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 
01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.381718 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-66c6d5d4cd-sncfn"] Feb 01 06:58:33 crc kubenswrapper[4546]: E0201 06:58:33.384249 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-httpd" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.384265 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-httpd" Feb 01 06:58:33 crc kubenswrapper[4546]: E0201 06:58:33.384276 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-api" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.384282 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-api" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.384470 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-api" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.384491 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" containerName="neutron-httpd" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.385010 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.389329 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.391344 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-72t6p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.392691 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.429900 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-66c6d5d4cd-sncfn"] Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.464388 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data-custom\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.464441 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvj7t\" (UniqueName: \"kubernetes.io/projected/ea10db39-8540-4ff0-9a34-859b497605a9-kube-api-access-rvj7t\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.464582 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-combined-ca-bundle\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 
crc kubenswrapper[4546]: I0201 06:58:33.464890 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.567427 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.567479 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data-custom\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.567514 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvj7t\" (UniqueName: \"kubernetes.io/projected/ea10db39-8540-4ff0-9a34-859b497605a9-kube-api-access-rvj7t\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.567637 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-combined-ca-bundle\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: 
I0201 06:58:33.598639 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data-custom\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.599921 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-combined-ca-bundle\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.615438 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.638398 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvj7t\" (UniqueName: \"kubernetes.io/projected/ea10db39-8540-4ff0-9a34-859b497605a9-kube-api-access-rvj7t\") pod \"heat-engine-66c6d5d4cd-sncfn\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.709726 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ac113d-2149-47d8-8a13-a864cdeff3ee" path="/var/lib/kubelet/pods/64ac113d-2149-47d8-8a13-a864cdeff3ee/volumes" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.710881 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b84f76f59-qvk5p"] Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.724953 4546 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b84f76f59-qvk5p"] Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.725187 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.747687 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.849845 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-svc\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.849934 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-sb\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.849978 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-swift-storage-0\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.850043 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dpjk\" (UniqueName: \"kubernetes.io/projected/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-kube-api-access-5dpjk\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: 
\"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.850067 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-nb\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.850145 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-config\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.883473 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6c4554789c-fj5bj"] Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.885105 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.891160 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.916605 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-54cfd8747b-fjphl"] Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.917760 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.921492 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.927197 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c4554789c-fj5bj"] Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.951309 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54cfd8747b-fjphl"] Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.952807 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dpjk\" (UniqueName: \"kubernetes.io/projected/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-kube-api-access-5dpjk\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.952874 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-nb\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.952968 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-config\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.953124 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-svc\") pod 
\"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.953192 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-sb\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.953226 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-swift-storage-0\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.954225 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-config\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.954235 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-svc\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.954765 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-sb\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " 
pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.955156 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-nb\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.955346 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-swift-storage-0\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:33 crc kubenswrapper[4546]: I0201 06:58:33.988769 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dpjk\" (UniqueName: \"kubernetes.io/projected/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-kube-api-access-5dpjk\") pod \"dnsmasq-dns-7b84f76f59-qvk5p\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.055372 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx4m5\" (UniqueName: \"kubernetes.io/projected/c41354a7-2b2f-4f4e-beb8-940543ae2e44-kube-api-access-rx4m5\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.055537 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " 
pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.055659 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb68s\" (UniqueName: \"kubernetes.io/projected/318d436c-d22c-415f-b171-66fa9901140f-kube-api-access-bb68s\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.055694 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-combined-ca-bundle\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.055733 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-combined-ca-bundle\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.055750 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data-custom\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.055783 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data-custom\") pod 
\"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.055894 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.157673 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb68s\" (UniqueName: \"kubernetes.io/projected/318d436c-d22c-415f-b171-66fa9901140f-kube-api-access-bb68s\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.157743 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-combined-ca-bundle\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.157785 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-combined-ca-bundle\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.157816 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data-custom\") pod 
\"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.157845 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data-custom\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.157941 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.158007 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx4m5\" (UniqueName: \"kubernetes.io/projected/c41354a7-2b2f-4f4e-beb8-940543ae2e44-kube-api-access-rx4m5\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.158116 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.169436 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.203007 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data-custom\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.207905 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.210670 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-combined-ca-bundle\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.211964 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data-custom\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.212141 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc 
kubenswrapper[4546]: I0201 06:58:34.213231 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-combined-ca-bundle\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.215014 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx4m5\" (UniqueName: \"kubernetes.io/projected/c41354a7-2b2f-4f4e-beb8-940543ae2e44-kube-api-access-rx4m5\") pod \"heat-api-54cfd8747b-fjphl\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.215580 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb68s\" (UniqueName: \"kubernetes.io/projected/318d436c-d22c-415f-b171-66fa9901140f-kube-api-access-bb68s\") pod \"heat-cfnapi-6c4554789c-fj5bj\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.237174 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58cd8b848-kmr5k" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.275455 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.305108 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.329953 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-fbbc88988-qj7hz"] Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.330527 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-fbbc88988-qj7hz" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api" containerID="cri-o://ef63fcb83333d4554b5d22233f18fab3499ef67fdecb9a66a44a0fd3cc78e3b4" gracePeriod=30 Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.330225 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-fbbc88988-qj7hz" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api-log" containerID="cri-o://6a1ed3909dc2ba2f0aab65e710c5ca7444f2519e140471d1df2c9537f4df77e3" gracePeriod=30 Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.370082 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-fbbc88988-qj7hz" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": EOF" Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.517906 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-66c6d5d4cd-sncfn"] Feb 01 06:58:34 crc kubenswrapper[4546]: W0201 06:58:34.522452 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea10db39_8540_4ff0_9a34_859b497605a9.slice/crio-25bf796d0a21ce13e558b8b2665e113fcc75bcb0448e08897fe02bc22429ae22 WatchSource:0}: Error finding container 25bf796d0a21ce13e558b8b2665e113fcc75bcb0448e08897fe02bc22429ae22: Status 404 returned error can't find the container with id 25bf796d0a21ce13e558b8b2665e113fcc75bcb0448e08897fe02bc22429ae22 Feb 01 06:58:34 
crc kubenswrapper[4546]: I0201 06:58:34.817055 4546 generic.go:334] "Generic (PLEG): container finished" podID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerID="6a1ed3909dc2ba2f0aab65e710c5ca7444f2519e140471d1df2c9537f4df77e3" exitCode=143 Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.817377 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fbbc88988-qj7hz" event={"ID":"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd","Type":"ContainerDied","Data":"6a1ed3909dc2ba2f0aab65e710c5ca7444f2519e140471d1df2c9537f4df77e3"} Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.826023 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-66c6d5d4cd-sncfn" event={"ID":"ea10db39-8540-4ff0-9a34-859b497605a9","Type":"ContainerStarted","Data":"25bf796d0a21ce13e558b8b2665e113fcc75bcb0448e08897fe02bc22429ae22"} Feb 01 06:58:34 crc kubenswrapper[4546]: I0201 06:58:34.848177 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b84f76f59-qvk5p"] Feb 01 06:58:34 crc kubenswrapper[4546]: W0201 06:58:34.858720 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded323bc5_e1f8_4472_87c9_cfa65bbdcbe1.slice/crio-5550bc567837eb2f0bb0f129e985a10ba4e034f0538fba7c5d42604f8d3c8521 WatchSource:0}: Error finding container 5550bc567837eb2f0bb0f129e985a10ba4e034f0538fba7c5d42604f8d3c8521: Status 404 returned error can't find the container with id 5550bc567837eb2f0bb0f129e985a10ba4e034f0538fba7c5d42604f8d3c8521 Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.046441 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c4554789c-fj5bj"] Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.196900 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-54cfd8747b-fjphl"] Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.760094 4546 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/swift-proxy-6879d57769-6vpbr"] Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.762209 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.771938 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.772094 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.772650 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.786511 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6879d57769-6vpbr"] Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.909781 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" event={"ID":"318d436c-d22c-415f-b171-66fa9901140f","Type":"ContainerStarted","Data":"2702f70335f19ca1f576d8bc4b2daa8cfadcd852b05b70a67a5fc842f4b27227"} Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.911671 4546 generic.go:334] "Generic (PLEG): container finished" podID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" containerID="4cc7281af917c039e5d2941c69756a0cdf60fb972929b5d2467215059aaac380" exitCode=0 Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.911906 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" event={"ID":"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1","Type":"ContainerDied","Data":"4cc7281af917c039e5d2941c69756a0cdf60fb972929b5d2467215059aaac380"} Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.912007 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" 
event={"ID":"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1","Type":"ContainerStarted","Data":"5550bc567837eb2f0bb0f129e985a10ba4e034f0538fba7c5d42604f8d3c8521"} Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.920289 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-config-data\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.920375 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-public-tls-certs\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.921056 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67372e25-2703-4eb4-8c5b-fb92083c3a0e-run-httpd\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.921137 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67372e25-2703-4eb4-8c5b-fb92083c3a0e-etc-swift\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.921203 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85r8w\" (UniqueName: 
\"kubernetes.io/projected/67372e25-2703-4eb4-8c5b-fb92083c3a0e-kube-api-access-85r8w\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.921972 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-internal-tls-certs\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.922093 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-combined-ca-bundle\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.922955 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67372e25-2703-4eb4-8c5b-fb92083c3a0e-log-httpd\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.928807 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-66c6d5d4cd-sncfn" event={"ID":"ea10db39-8540-4ff0-9a34-859b497605a9","Type":"ContainerStarted","Data":"ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2"} Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.929104 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.947261 
4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-66c6d5d4cd-sncfn" podStartSLOduration=2.9472396610000002 podStartE2EDuration="2.947239661s" podCreationTimestamp="2026-02-01 06:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:35.946265215 +0000 UTC m=+946.597201230" watchObservedRunningTime="2026-02-01 06:58:35.947239661 +0000 UTC m=+946.598175678" Feb 01 06:58:35 crc kubenswrapper[4546]: I0201 06:58:35.949172 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54cfd8747b-fjphl" event={"ID":"c41354a7-2b2f-4f4e-beb8-940543ae2e44","Type":"ContainerStarted","Data":"d942b0c8c0c63c8c755644df1d4a0f148b094842ba7646c6a923282f5878147d"} Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.025725 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67372e25-2703-4eb4-8c5b-fb92083c3a0e-run-httpd\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.025894 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/67372e25-2703-4eb4-8c5b-fb92083c3a0e-etc-swift\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.025975 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85r8w\" (UniqueName: \"kubernetes.io/projected/67372e25-2703-4eb4-8c5b-fb92083c3a0e-kube-api-access-85r8w\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 
06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.026092 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-internal-tls-certs\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.026136 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-combined-ca-bundle\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.026197 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67372e25-2703-4eb4-8c5b-fb92083c3a0e-log-httpd\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.026480 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-config-data\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.026535 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-public-tls-certs\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.027686 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67372e25-2703-4eb4-8c5b-fb92083c3a0e-log-httpd\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.033292 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67372e25-2703-4eb4-8c5b-fb92083c3a0e-run-httpd\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.042749 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-combined-ca-bundle\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.051554 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85r8w\" (UniqueName: \"kubernetes.io/projected/67372e25-2703-4eb4-8c5b-fb92083c3a0e-kube-api-access-85r8w\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.062067 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-public-tls-certs\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.075112 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/67372e25-2703-4eb4-8c5b-fb92083c3a0e-etc-swift\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.076147 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-config-data\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.084778 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67372e25-2703-4eb4-8c5b-fb92083c3a0e-internal-tls-certs\") pod \"swift-proxy-6879d57769-6vpbr\" (UID: \"67372e25-2703-4eb4-8c5b-fb92083c3a0e\") " pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.400812 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.780714 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.979092 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" event={"ID":"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1","Type":"ContainerStarted","Data":"85282ebf44055d02053ca601be216e4ae8e1c8b7a1b5dc69caffdb56b55e2400"} Feb 01 06:58:36 crc kubenswrapper[4546]: I0201 06:58:36.979579 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:37 crc kubenswrapper[4546]: I0201 06:58:37.000943 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" podStartSLOduration=4.000930085 podStartE2EDuration="4.000930085s" podCreationTimestamp="2026-02-01 06:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:36.999286667 +0000 UTC m=+947.650222683" watchObservedRunningTime="2026-02-01 06:58:37.000930085 +0000 UTC m=+947.651866111" Feb 01 06:58:37 crc kubenswrapper[4546]: I0201 06:58:37.105019 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 01 06:58:37 crc kubenswrapper[4546]: I0201 06:58:37.331479 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6879d57769-6vpbr"] Feb 01 06:58:38 crc kubenswrapper[4546]: I0201 06:58:38.012002 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6879d57769-6vpbr" event={"ID":"67372e25-2703-4eb4-8c5b-fb92083c3a0e","Type":"ContainerStarted","Data":"f2f0c494d08b22319d0cfb28b79fa52eb8f79a97ad57a0c796cd97c8303882f3"} Feb 01 06:58:38 crc kubenswrapper[4546]: I0201 
06:58:38.012338 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6879d57769-6vpbr" event={"ID":"67372e25-2703-4eb4-8c5b-fb92083c3a0e","Type":"ContainerStarted","Data":"519b2190190b0d033184073d3ba285fd597707b52dbbd5e05855a46ded68c19a"} Feb 01 06:58:38 crc kubenswrapper[4546]: I0201 06:58:38.321121 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-fbbc88988-qj7hz" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": dial tcp 10.217.0.169:9311: connect: connection refused" Feb 01 06:58:38 crc kubenswrapper[4546]: I0201 06:58:38.321754 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-fbbc88988-qj7hz" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": dial tcp 10.217.0.169:9311: connect: connection refused" Feb 01 06:58:39 crc kubenswrapper[4546]: I0201 06:58:39.096468 4546 generic.go:334] "Generic (PLEG): container finished" podID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerID="ef63fcb83333d4554b5d22233f18fab3499ef67fdecb9a66a44a0fd3cc78e3b4" exitCode=0 Feb 01 06:58:39 crc kubenswrapper[4546]: I0201 06:58:39.096697 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fbbc88988-qj7hz" event={"ID":"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd","Type":"ContainerDied","Data":"ef63fcb83333d4554b5d22233f18fab3499ef67fdecb9a66a44a0fd3cc78e3b4"} Feb 01 06:58:39 crc kubenswrapper[4546]: I0201 06:58:39.957242 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.073526 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-combined-ca-bundle\") pod \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.073620 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data\") pod \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.073692 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data-custom\") pod \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.073818 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7lmj\" (UniqueName: \"kubernetes.io/projected/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-kube-api-access-r7lmj\") pod \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.073867 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-logs\") pod \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\" (UID: \"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd\") " Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.074579 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-logs" (OuterVolumeSpecName: "logs") pod "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" (UID: "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.082930 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" (UID: "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.082953 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-kube-api-access-r7lmj" (OuterVolumeSpecName: "kube-api-access-r7lmj") pod "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" (UID: "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd"). InnerVolumeSpecName "kube-api-access-r7lmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.133140 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fbbc88988-qj7hz" event={"ID":"552e7895-1c0c-4bd0-a3bb-e7ecc50331cd","Type":"ContainerDied","Data":"b8b5cfa9bbb5327165ea2c646588515a8b8cc86a25bafb6b356d4ef6b1f56af1"} Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.133193 4546 scope.go:117] "RemoveContainer" containerID="ef63fcb83333d4554b5d22233f18fab3499ef67fdecb9a66a44a0fd3cc78e3b4" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.133327 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-fbbc88988-qj7hz" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.145695 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" (UID: "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.180602 4546 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.180713 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7lmj\" (UniqueName: \"kubernetes.io/projected/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-kube-api-access-r7lmj\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.180792 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.180897 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.205233 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data" (OuterVolumeSpecName: "config-data") pod "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" (UID: "552e7895-1c0c-4bd0-a3bb-e7ecc50331cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.283421 4546 scope.go:117] "RemoveContainer" containerID="6a1ed3909dc2ba2f0aab65e710c5ca7444f2519e140471d1df2c9537f4df77e3" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.285565 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.467361 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-fbbc88988-qj7hz"] Feb 01 06:58:40 crc kubenswrapper[4546]: I0201 06:58:40.477208 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-fbbc88988-qj7hz"] Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.167049 4546 generic.go:334] "Generic (PLEG): container finished" podID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerID="cc02b6b9ba589cae973a7abffbbd6564dd4c6e4bdba7743789ceaad408f98e15" exitCode=137 Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.167109 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerDied","Data":"cc02b6b9ba589cae973a7abffbbd6564dd4c6e4bdba7743789ceaad408f98e15"} Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.169279 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54cfd8747b-fjphl" event={"ID":"c41354a7-2b2f-4f4e-beb8-940543ae2e44","Type":"ContainerStarted","Data":"8b04e1d2ab0d6575c5f725ba50cb7d269417fea997ede8f7a256fba765f888c3"} Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.169931 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.173280 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-cfnapi-6c4554789c-fj5bj" event={"ID":"318d436c-d22c-415f-b171-66fa9901140f","Type":"ContainerStarted","Data":"60575e99805c928ba337da0b76e9abc766c7dce1a910069b7ec3f4804b90d05f"} Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.173716 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.179292 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6879d57769-6vpbr" event={"ID":"67372e25-2703-4eb4-8c5b-fb92083c3a0e","Type":"ContainerStarted","Data":"8899e4eee9f2be2769ea7351cacb7dd738aaebf8699be56a1b3d8c741629e91a"} Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.179762 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.179793 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.189398 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-54cfd8747b-fjphl" podStartSLOduration=3.537946738 podStartE2EDuration="8.189389339s" podCreationTimestamp="2026-02-01 06:58:33 +0000 UTC" firstStartedPulling="2026-02-01 06:58:35.209275811 +0000 UTC m=+945.860211827" lastFinishedPulling="2026-02-01 06:58:39.860718422 +0000 UTC m=+950.511654428" observedRunningTime="2026-02-01 06:58:41.185920371 +0000 UTC m=+951.836856386" watchObservedRunningTime="2026-02-01 06:58:41.189389339 +0000 UTC m=+951.840325356" Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.213577 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6879d57769-6vpbr" podStartSLOduration=6.2135547540000005 podStartE2EDuration="6.213554754s" podCreationTimestamp="2026-02-01 06:58:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:41.212941938 +0000 UTC m=+951.863877954" watchObservedRunningTime="2026-02-01 06:58:41.213554754 +0000 UTC m=+951.864490770" Feb 01 06:58:41 crc kubenswrapper[4546]: I0201 06:58:41.665495 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" path="/var/lib/kubelet/pods/552e7895-1c0c-4bd0-a3bb-e7ecc50331cd/volumes" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.004019 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" podStartSLOduration=4.231173356 podStartE2EDuration="9.0040017s" podCreationTimestamp="2026-02-01 06:58:33 +0000 UTC" firstStartedPulling="2026-02-01 06:58:35.08187035 +0000 UTC m=+945.732806366" lastFinishedPulling="2026-02-01 06:58:39.854698694 +0000 UTC m=+950.505634710" observedRunningTime="2026-02-01 06:58:41.235273676 +0000 UTC m=+951.886209692" watchObservedRunningTime="2026-02-01 06:58:42.0040017 +0000 UTC m=+952.654937716" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.010580 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-76d966855c-lglrc"] Feb 01 06:58:42 crc kubenswrapper[4546]: E0201 06:58:42.010970 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api-log" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.010989 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api-log" Feb 01 06:58:42 crc kubenswrapper[4546]: E0201 06:58:42.011006 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.011012 4546 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.011175 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.011195 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="552e7895-1c0c-4bd0-a3bb-e7ecc50331cd" containerName="barbican-api-log" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.011808 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.028069 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-76d966855c-lglrc"] Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.093506 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6d9df55579-82dts"] Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.094990 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.117471 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-54876bc7f7-6wwtv"] Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.118614 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.133812 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgwrk\" (UniqueName: \"kubernetes.io/projected/358b4015-270f-4f22-918f-8c12b60603a3-kube-api-access-wgwrk\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.133900 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.133955 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-combined-ca-bundle\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.134032 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmww\" (UniqueName: \"kubernetes.io/projected/3e9aee44-bbac-4f06-8187-cad533ab8a87-kube-api-access-qkmww\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.134104 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/358b4015-270f-4f22-918f-8c12b60603a3-config-data\") pod 
\"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.134260 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data-custom\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.134302 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358b4015-270f-4f22-918f-8c12b60603a3-combined-ca-bundle\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.134337 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/358b4015-270f-4f22-918f-8c12b60603a3-config-data-custom\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.142058 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-54876bc7f7-6wwtv"] Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.148087 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6d9df55579-82dts"] Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.197831 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6879d57769-6vpbr" podUID="67372e25-2703-4eb4-8c5b-fb92083c3a0e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 
01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.235657 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-combined-ca-bundle\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.235731 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmww\" (UniqueName: \"kubernetes.io/projected/3e9aee44-bbac-4f06-8187-cad533ab8a87-kube-api-access-qkmww\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.235776 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data-custom\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.235843 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/358b4015-270f-4f22-918f-8c12b60603a3-config-data\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.235879 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-combined-ca-bundle\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 
06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.235905 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqtz9\" (UniqueName: \"kubernetes.io/projected/e1c27e30-fbbd-41e2-90d4-142797e326c8-kube-api-access-gqtz9\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.235996 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data-custom\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.236025 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358b4015-270f-4f22-918f-8c12b60603a3-combined-ca-bundle\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.236061 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/358b4015-270f-4f22-918f-8c12b60603a3-config-data-custom\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.236082 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 
06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.236188 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgwrk\" (UniqueName: \"kubernetes.io/projected/358b4015-270f-4f22-918f-8c12b60603a3-kube-api-access-wgwrk\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.236215 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.254662 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-combined-ca-bundle\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.258674 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data-custom\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.258894 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/358b4015-270f-4f22-918f-8c12b60603a3-combined-ca-bundle\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.261967 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.265729 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgwrk\" (UniqueName: \"kubernetes.io/projected/358b4015-270f-4f22-918f-8c12b60603a3-kube-api-access-wgwrk\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.265905 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/358b4015-270f-4f22-918f-8c12b60603a3-config-data-custom\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.266348 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmww\" (UniqueName: \"kubernetes.io/projected/3e9aee44-bbac-4f06-8187-cad533ab8a87-kube-api-access-qkmww\") pod \"heat-api-6d9df55579-82dts\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.292947 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/358b4015-270f-4f22-918f-8c12b60603a3-config-data\") pod \"heat-engine-76d966855c-lglrc\" (UID: \"358b4015-270f-4f22-918f-8c12b60603a3\") " pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.341817 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data-custom\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.341950 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-combined-ca-bundle\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.341982 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqtz9\" (UniqueName: \"kubernetes.io/projected/e1c27e30-fbbd-41e2-90d4-142797e326c8-kube-api-access-gqtz9\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.342134 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.343422 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.360793 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqtz9\" (UniqueName: \"kubernetes.io/projected/e1c27e30-fbbd-41e2-90d4-142797e326c8-kube-api-access-gqtz9\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.362502 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data-custom\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.364468 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-combined-ca-bundle\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.378324 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data\") pod \"heat-cfnapi-54876bc7f7-6wwtv\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.409952 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-6879d57769-6vpbr" podUID="67372e25-2703-4eb4-8c5b-fb92083c3a0e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.435990 4546 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:42 crc kubenswrapper[4546]: I0201 06:58:42.452709 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:43 crc kubenswrapper[4546]: I0201 06:58:43.247388 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.172040 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.241061 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c4554789c-fj5bj"] Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.272606 4546 generic.go:334] "Generic (PLEG): container finished" podID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerID="7d86ac28320dfdeffcd7f6de1c9aec106f75400f1752f6450b264050c4e7d9ce" exitCode=137 Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.272816 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" podUID="318d436c-d22c-415f-b171-66fa9901140f" containerName="heat-cfnapi" containerID="cri-o://60575e99805c928ba337da0b76e9abc766c7dce1a910069b7ec3f4804b90d05f" gracePeriod=60 Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.273163 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8bd8cd6b-vfjlr" event={"ID":"42765622-7cd6-4ad8-9917-35e6fccc928d","Type":"ContainerDied","Data":"7d86ac28320dfdeffcd7f6de1c9aec106f75400f1752f6450b264050c4e7d9ce"} Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.283716 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-54cfd8747b-fjphl"] Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.283979 4546 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/heat-api-54cfd8747b-fjphl" podUID="c41354a7-2b2f-4f4e-beb8-940543ae2e44" containerName="heat-api" containerID="cri-o://8b04e1d2ab0d6575c5f725ba50cb7d269417fea997ede8f7a256fba765f888c3" gracePeriod=60 Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.306893 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f9ff4476f-89c94"] Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.307136 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" podUID="95418b3b-b693-4b25-8ce8-967d233a1e54" containerName="dnsmasq-dns" containerID="cri-o://c01ec8ed21578560d39dd4afe1e10dbffedb3b6af61d452e12535b725db11c5a" gracePeriod=10 Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.340885 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-8677d45756-xk5bt"] Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.342394 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.346288 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.351215 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.366222 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-667c96b6cb-hf7jb"] Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.367728 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.370485 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.380732 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.381042 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-667c96b6cb-hf7jb"] Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.389643 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-public-tls-certs\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.389709 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-config-data\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.389900 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-config-data-custom\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.390021 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-combined-ca-bundle\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.390118 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnnsg\" (UniqueName: \"kubernetes.io/projected/0e626c44-78bc-403d-98cd-2a6b09ab189e-kube-api-access-pnnsg\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.390202 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-internal-tls-certs\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.402928 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-8677d45756-xk5bt"] Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492274 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-public-tls-certs\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492338 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-internal-tls-certs\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 
06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492359 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-config-data-custom\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492378 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-config-data\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492397 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72zp\" (UniqueName: \"kubernetes.io/projected/2bde2be3-e30f-4116-bf00-2e6816dd43dc-kube-api-access-v72zp\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492441 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-public-tls-certs\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492468 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-config-data\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 
06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492527 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-config-data-custom\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492558 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-combined-ca-bundle\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492586 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-internal-tls-certs\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492625 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-combined-ca-bundle\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.492670 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnnsg\" (UniqueName: \"kubernetes.io/projected/0e626c44-78bc-403d-98cd-2a6b09ab189e-kube-api-access-pnnsg\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 
06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.504477 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-combined-ca-bundle\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.506844 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-config-data\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.509177 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-internal-tls-certs\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.509315 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-public-tls-certs\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.509347 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e626c44-78bc-403d-98cd-2a6b09ab189e-config-data-custom\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.512273 4546 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-pnnsg\" (UniqueName: \"kubernetes.io/projected/0e626c44-78bc-403d-98cd-2a6b09ab189e-kube-api-access-pnnsg\") pod \"heat-api-8677d45756-xk5bt\" (UID: \"0e626c44-78bc-403d-98cd-2a6b09ab189e\") " pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.594184 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-combined-ca-bundle\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.594228 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-internal-tls-certs\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.594295 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-public-tls-certs\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.594320 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-config-data-custom\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.594335 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-config-data\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.594351 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72zp\" (UniqueName: \"kubernetes.io/projected/2bde2be3-e30f-4116-bf00-2e6816dd43dc-kube-api-access-v72zp\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.599544 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-config-data\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.600156 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-internal-tls-certs\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.600891 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-public-tls-certs\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.605624 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-combined-ca-bundle\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.610957 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2bde2be3-e30f-4116-bf00-2e6816dd43dc-config-data-custom\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.611455 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72zp\" (UniqueName: \"kubernetes.io/projected/2bde2be3-e30f-4116-bf00-2e6816dd43dc-kube-api-access-v72zp\") pod \"heat-cfnapi-667c96b6cb-hf7jb\" (UID: \"2bde2be3-e30f-4116-bf00-2e6816dd43dc\") " pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.682151 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:44 crc kubenswrapper[4546]: I0201 06:58:44.695743 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:45 crc kubenswrapper[4546]: I0201 06:58:45.289032 4546 generic.go:334] "Generic (PLEG): container finished" podID="c41354a7-2b2f-4f4e-beb8-940543ae2e44" containerID="8b04e1d2ab0d6575c5f725ba50cb7d269417fea997ede8f7a256fba765f888c3" exitCode=0 Feb 01 06:58:45 crc kubenswrapper[4546]: I0201 06:58:45.289132 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54cfd8747b-fjphl" event={"ID":"c41354a7-2b2f-4f4e-beb8-940543ae2e44","Type":"ContainerDied","Data":"8b04e1d2ab0d6575c5f725ba50cb7d269417fea997ede8f7a256fba765f888c3"} Feb 01 06:58:45 crc kubenswrapper[4546]: I0201 06:58:45.299281 4546 generic.go:334] "Generic (PLEG): container finished" podID="95418b3b-b693-4b25-8ce8-967d233a1e54" containerID="c01ec8ed21578560d39dd4afe1e10dbffedb3b6af61d452e12535b725db11c5a" exitCode=0 Feb 01 06:58:45 crc kubenswrapper[4546]: I0201 06:58:45.299346 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" event={"ID":"95418b3b-b693-4b25-8ce8-967d233a1e54","Type":"ContainerDied","Data":"c01ec8ed21578560d39dd4afe1e10dbffedb3b6af61d452e12535b725db11c5a"} Feb 01 06:58:45 crc kubenswrapper[4546]: I0201 06:58:45.301590 4546 generic.go:334] "Generic (PLEG): container finished" podID="318d436c-d22c-415f-b171-66fa9901140f" containerID="60575e99805c928ba337da0b76e9abc766c7dce1a910069b7ec3f4804b90d05f" exitCode=0 Feb 01 06:58:45 crc kubenswrapper[4546]: I0201 06:58:45.301622 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" event={"ID":"318d436c-d22c-415f-b171-66fa9901140f","Type":"ContainerDied","Data":"60575e99805c928ba337da0b76e9abc766c7dce1a910069b7ec3f4804b90d05f"} Feb 01 06:58:46 crc kubenswrapper[4546]: I0201 06:58:46.413550 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6879d57769-6vpbr" Feb 01 06:58:48 crc kubenswrapper[4546]: I0201 
06:58:48.111849 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" podUID="95418b3b-b693-4b25-8ce8-967d233a1e54" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: connect: connection refused" Feb 01 06:58:49 crc kubenswrapper[4546]: I0201 06:58:49.278911 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" podUID="318d436c-d22c-415f-b171-66fa9901140f" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.178:8000/healthcheck\": dial tcp 10.217.0.178:8000: connect: connection refused" Feb 01 06:58:49 crc kubenswrapper[4546]: I0201 06:58:49.306631 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-54cfd8747b-fjphl" podUID="c41354a7-2b2f-4f4e-beb8-940543ae2e44" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.179:8004/healthcheck\": dial tcp 10.217.0.179:8004: connect: connection refused" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.063471 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.168828 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data-custom\") pod \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.169060 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx4m5\" (UniqueName: \"kubernetes.io/projected/c41354a7-2b2f-4f4e-beb8-940543ae2e44-kube-api-access-rx4m5\") pod \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.169129 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data\") pod \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.169190 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-combined-ca-bundle\") pod \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\" (UID: \"c41354a7-2b2f-4f4e-beb8-940543ae2e44\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.180243 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c41354a7-2b2f-4f4e-beb8-940543ae2e44" (UID: "c41354a7-2b2f-4f4e-beb8-940543ae2e44"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.185844 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41354a7-2b2f-4f4e-beb8-940543ae2e44-kube-api-access-rx4m5" (OuterVolumeSpecName: "kube-api-access-rx4m5") pod "c41354a7-2b2f-4f4e-beb8-940543ae2e44" (UID: "c41354a7-2b2f-4f4e-beb8-940543ae2e44"). InnerVolumeSpecName "kube-api-access-rx4m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.204749 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c41354a7-2b2f-4f4e-beb8-940543ae2e44" (UID: "c41354a7-2b2f-4f4e-beb8-940543ae2e44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.284120 4546 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.284148 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx4m5\" (UniqueName: \"kubernetes.io/projected/c41354a7-2b2f-4f4e-beb8-940543ae2e44-kube-api-access-rx4m5\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.284161 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.287046 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data" (OuterVolumeSpecName: "config-data") pod "c41354a7-2b2f-4f4e-beb8-940543ae2e44" (UID: "c41354a7-2b2f-4f4e-beb8-940543ae2e44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.346702 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.349602 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.367056 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.383073 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" event={"ID":"318d436c-d22c-415f-b171-66fa9901140f","Type":"ContainerDied","Data":"2702f70335f19ca1f576d8bc4b2daa8cfadcd852b05b70a67a5fc842f4b27227"} Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.383136 4546 scope.go:117] "RemoveContainer" containerID="60575e99805c928ba337da0b76e9abc766c7dce1a910069b7ec3f4804b90d05f" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.383263 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6c4554789c-fj5bj" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.387434 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41354a7-2b2f-4f4e-beb8-940543ae2e44-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.397673 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"64feee97-62ee-4dd2-a584-3bad4c95165e","Type":"ContainerStarted","Data":"6e0b7c881b3e43cdda23fc1ba1afe8eee80b86f1d28dadb6862e58def24d416b"} Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.412019 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.412163 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd1d825a-ca7c-4a01-9f10-52876f202ef6","Type":"ContainerDied","Data":"e5882db122c3bf6a7b1aaede3a104dc53ba6a06ceade3323ee4c5184a60859d6"} Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.434576 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c8bd8cd6b-vfjlr" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon-log" containerID="cri-o://1dc96d1f38420507484550dc5fea604ec7287d0ef1855005f7600d236c468b7c" gracePeriod=30 Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.434680 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8bd8cd6b-vfjlr" event={"ID":"42765622-7cd6-4ad8-9917-35e6fccc928d","Type":"ContainerStarted","Data":"1defa9cde24bd7fa205a1d242ff1e49e8126f438e7db916597ed746325e80d73"} Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.434741 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c8bd8cd6b-vfjlr" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" 
containerName="horizon" containerID="cri-o://1defa9cde24bd7fa205a1d242ff1e49e8126f438e7db916597ed746325e80d73" gracePeriod=30 Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.448485 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-54cfd8747b-fjphl" event={"ID":"c41354a7-2b2f-4f4e-beb8-940543ae2e44","Type":"ContainerDied","Data":"d942b0c8c0c63c8c755644df1d4a0f148b094842ba7646c6a923282f5878147d"} Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.448629 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-54cfd8747b-fjphl" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.449839 4546 scope.go:117] "RemoveContainer" containerID="cc02b6b9ba589cae973a7abffbbd6564dd4c6e4bdba7743789ceaad408f98e15" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.455825 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.919427472 podStartE2EDuration="23.45580875s" podCreationTimestamp="2026-02-01 06:58:27 +0000 UTC" firstStartedPulling="2026-02-01 06:58:28.209768971 +0000 UTC m=+938.860704987" lastFinishedPulling="2026-02-01 06:58:49.746150249 +0000 UTC m=+960.397086265" observedRunningTime="2026-02-01 06:58:50.450145275 +0000 UTC m=+961.101081291" watchObservedRunningTime="2026-02-01 06:58:50.45580875 +0000 UTC m=+961.106744767" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.477082 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" event={"ID":"95418b3b-b693-4b25-8ce8-967d233a1e54","Type":"ContainerDied","Data":"886881be3d1b8be6b17f7ee9aa05deef182eec19eb06d797cb876a80f2207cdd"} Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.477210 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f9ff4476f-89c94" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488049 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-sb\") pod \"95418b3b-b693-4b25-8ce8-967d233a1e54\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488097 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-config\") pod \"95418b3b-b693-4b25-8ce8-967d233a1e54\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488206 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjmdd\" (UniqueName: \"kubernetes.io/projected/95418b3b-b693-4b25-8ce8-967d233a1e54-kube-api-access-xjmdd\") pod \"95418b3b-b693-4b25-8ce8-967d233a1e54\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488309 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-sg-core-conf-yaml\") pod \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488330 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-svc\") pod \"95418b3b-b693-4b25-8ce8-967d233a1e54\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488353 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-combined-ca-bundle\") pod \"318d436c-d22c-415f-b171-66fa9901140f\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488384 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-nb\") pod \"95418b3b-b693-4b25-8ce8-967d233a1e54\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488413 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-run-httpd\") pod \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488430 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data\") pod \"318d436c-d22c-415f-b171-66fa9901140f\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488469 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-scripts\") pod \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488506 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb68s\" (UniqueName: \"kubernetes.io/projected/318d436c-d22c-415f-b171-66fa9901140f-kube-api-access-bb68s\") pod \"318d436c-d22c-415f-b171-66fa9901140f\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 
06:58:50.488544 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data-custom\") pod \"318d436c-d22c-415f-b171-66fa9901140f\" (UID: \"318d436c-d22c-415f-b171-66fa9901140f\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488614 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-log-httpd\") pod \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488651 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fqct\" (UniqueName: \"kubernetes.io/projected/dd1d825a-ca7c-4a01-9f10-52876f202ef6-kube-api-access-6fqct\") pod \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488708 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-config-data\") pod \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488756 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-combined-ca-bundle\") pod \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\" (UID: \"dd1d825a-ca7c-4a01-9f10-52876f202ef6\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.488804 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-swift-storage-0\") 
pod \"95418b3b-b693-4b25-8ce8-967d233a1e54\" (UID: \"95418b3b-b693-4b25-8ce8-967d233a1e54\") " Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.489538 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd1d825a-ca7c-4a01-9f10-52876f202ef6" (UID: "dd1d825a-ca7c-4a01-9f10-52876f202ef6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.497175 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95418b3b-b693-4b25-8ce8-967d233a1e54-kube-api-access-xjmdd" (OuterVolumeSpecName: "kube-api-access-xjmdd") pod "95418b3b-b693-4b25-8ce8-967d233a1e54" (UID: "95418b3b-b693-4b25-8ce8-967d233a1e54"). InnerVolumeSpecName "kube-api-access-xjmdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.499542 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318d436c-d22c-415f-b171-66fa9901140f-kube-api-access-bb68s" (OuterVolumeSpecName: "kube-api-access-bb68s") pod "318d436c-d22c-415f-b171-66fa9901140f" (UID: "318d436c-d22c-415f-b171-66fa9901140f"). InnerVolumeSpecName "kube-api-access-bb68s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.509658 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd1d825a-ca7c-4a01-9f10-52876f202ef6" (UID: "dd1d825a-ca7c-4a01-9f10-52876f202ef6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.526239 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "318d436c-d22c-415f-b171-66fa9901140f" (UID: "318d436c-d22c-415f-b171-66fa9901140f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.536021 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-54cfd8747b-fjphl"] Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.536560 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1d825a-ca7c-4a01-9f10-52876f202ef6-kube-api-access-6fqct" (OuterVolumeSpecName: "kube-api-access-6fqct") pod "dd1d825a-ca7c-4a01-9f10-52876f202ef6" (UID: "dd1d825a-ca7c-4a01-9f10-52876f202ef6"). InnerVolumeSpecName "kube-api-access-6fqct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.543034 4546 scope.go:117] "RemoveContainer" containerID="fe732eac3b0b024b973f4d60a23efb8d9e2182a1699c6d4a0b204cf9c53035e4" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.543260 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-scripts" (OuterVolumeSpecName: "scripts") pod "dd1d825a-ca7c-4a01-9f10-52876f202ef6" (UID: "dd1d825a-ca7c-4a01-9f10-52876f202ef6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.558220 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-54cfd8747b-fjphl"] Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.572936 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd1d825a-ca7c-4a01-9f10-52876f202ef6" (UID: "dd1d825a-ca7c-4a01-9f10-52876f202ef6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.584502 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "318d436c-d22c-415f-b171-66fa9901140f" (UID: "318d436c-d22c-415f-b171-66fa9901140f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.610488 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjmdd\" (UniqueName: \"kubernetes.io/projected/95418b3b-b693-4b25-8ce8-967d233a1e54-kube-api-access-xjmdd\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.610512 4546 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.610523 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.610532 4546 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.610543 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.610551 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb68s\" (UniqueName: \"kubernetes.io/projected/318d436c-d22c-415f-b171-66fa9901140f-kube-api-access-bb68s\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.610559 4546 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 
06:58:50.610568 4546 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd1d825a-ca7c-4a01-9f10-52876f202ef6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.610576 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fqct\" (UniqueName: \"kubernetes.io/projected/dd1d825a-ca7c-4a01-9f10-52876f202ef6-kube-api-access-6fqct\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.668869 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-config" (OuterVolumeSpecName: "config") pod "95418b3b-b693-4b25-8ce8-967d233a1e54" (UID: "95418b3b-b693-4b25-8ce8-967d233a1e54"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.671253 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95418b3b-b693-4b25-8ce8-967d233a1e54" (UID: "95418b3b-b693-4b25-8ce8-967d233a1e54"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.683526 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data" (OuterVolumeSpecName: "config-data") pod "318d436c-d22c-415f-b171-66fa9901140f" (UID: "318d436c-d22c-415f-b171-66fa9901140f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.706994 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "95418b3b-b693-4b25-8ce8-967d233a1e54" (UID: "95418b3b-b693-4b25-8ce8-967d233a1e54"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.726529 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.726596 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/318d436c-d22c-415f-b171-66fa9901140f-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.726611 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.726818 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.730520 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "95418b3b-b693-4b25-8ce8-967d233a1e54" (UID: "95418b3b-b693-4b25-8ce8-967d233a1e54"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.803104 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd1d825a-ca7c-4a01-9f10-52876f202ef6" (UID: "dd1d825a-ca7c-4a01-9f10-52876f202ef6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.803629 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "95418b3b-b693-4b25-8ce8-967d233a1e54" (UID: "95418b3b-b693-4b25-8ce8-967d233a1e54"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.809097 4546 scope.go:117] "RemoveContainer" containerID="4a5f15bd1d7835c016f46c2f196f9d2d2ae66c2104844c833cfa3d78a502e4a4" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.835102 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.835129 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.835138 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95418b3b-b693-4b25-8ce8-967d233a1e54-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.865603 
4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-config-data" (OuterVolumeSpecName: "config-data") pod "dd1d825a-ca7c-4a01-9f10-52876f202ef6" (UID: "dd1d825a-ca7c-4a01-9f10-52876f202ef6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.888974 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-667c96b6cb-hf7jb"] Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.890900 4546 scope.go:117] "RemoveContainer" containerID="07590c57da60555fe686858a2df6c9fc569ea928439e69fe7aecfb572f0003eb" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.897978 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-8677d45756-xk5bt"] Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.920076 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-54876bc7f7-6wwtv"] Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.936352 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd1d825a-ca7c-4a01-9f10-52876f202ef6-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.954338 4546 scope.go:117] "RemoveContainer" containerID="8b04e1d2ab0d6575c5f725ba50cb7d269417fea997ede8f7a256fba765f888c3" Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.954486 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-76d966855c-lglrc"] Feb 01 06:58:50 crc kubenswrapper[4546]: W0201 06:58:50.967401 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod358b4015_270f_4f22_918f_8c12b60603a3.slice/crio-f42a195674b2e32cd0605473d9c0be7d1c85bc3e532d844ba9eac716fa76d817 WatchSource:0}: Error finding container 
f42a195674b2e32cd0605473d9c0be7d1c85bc3e532d844ba9eac716fa76d817: Status 404 returned error can't find the container with id f42a195674b2e32cd0605473d9c0be7d1c85bc3e532d844ba9eac716fa76d817 Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.974246 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6d9df55579-82dts"] Feb 01 06:58:50 crc kubenswrapper[4546]: W0201 06:58:50.980193 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e9aee44_bbac_4f06_8187_cad533ab8a87.slice/crio-c4410971f6b899373326ef879474711509032a387b35892628a324bc98c2f780 WatchSource:0}: Error finding container c4410971f6b899373326ef879474711509032a387b35892628a324bc98c2f780: Status 404 returned error can't find the container with id c4410971f6b899373326ef879474711509032a387b35892628a324bc98c2f780 Feb 01 06:58:50 crc kubenswrapper[4546]: I0201 06:58:50.997744 4546 scope.go:117] "RemoveContainer" containerID="c01ec8ed21578560d39dd4afe1e10dbffedb3b6af61d452e12535b725db11c5a" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.048152 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c4554789c-fj5bj"] Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.058010 4546 scope.go:117] "RemoveContainer" containerID="fce29137df87e1ed3a51aaedb21353046645dcb491faa2db5f415d99a8f6b3d8" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.073991 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6c4554789c-fj5bj"] Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.104175 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.147305 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.171132 4546 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Feb 01 06:58:51 crc kubenswrapper[4546]: E0201 06:58:51.171674 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="sg-core" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.171695 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="sg-core" Feb 01 06:58:51 crc kubenswrapper[4546]: E0201 06:58:51.171709 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41354a7-2b2f-4f4e-beb8-940543ae2e44" containerName="heat-api" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.171715 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41354a7-2b2f-4f4e-beb8-940543ae2e44" containerName="heat-api" Feb 01 06:58:51 crc kubenswrapper[4546]: E0201 06:58:51.171744 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95418b3b-b693-4b25-8ce8-967d233a1e54" containerName="dnsmasq-dns" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.171751 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="95418b3b-b693-4b25-8ce8-967d233a1e54" containerName="dnsmasq-dns" Feb 01 06:58:51 crc kubenswrapper[4546]: E0201 06:58:51.171761 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="ceilometer-central-agent" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.171767 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="ceilometer-central-agent" Feb 01 06:58:51 crc kubenswrapper[4546]: E0201 06:58:51.171784 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="ceilometer-notification-agent" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.171792 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" 
containerName="ceilometer-notification-agent" Feb 01 06:58:51 crc kubenswrapper[4546]: E0201 06:58:51.171825 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95418b3b-b693-4b25-8ce8-967d233a1e54" containerName="init" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.171831 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="95418b3b-b693-4b25-8ce8-967d233a1e54" containerName="init" Feb 01 06:58:51 crc kubenswrapper[4546]: E0201 06:58:51.171847 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="proxy-httpd" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.171852 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="proxy-httpd" Feb 01 06:58:51 crc kubenswrapper[4546]: E0201 06:58:51.171885 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318d436c-d22c-415f-b171-66fa9901140f" containerName="heat-cfnapi" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.171891 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="318d436c-d22c-415f-b171-66fa9901140f" containerName="heat-cfnapi" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.172093 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="proxy-httpd" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.172122 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="ceilometer-central-agent" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.172132 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="sg-core" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.172144 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="95418b3b-b693-4b25-8ce8-967d233a1e54" 
containerName="dnsmasq-dns" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.172154 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41354a7-2b2f-4f4e-beb8-940543ae2e44" containerName="heat-api" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.172163 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="318d436c-d22c-415f-b171-66fa9901140f" containerName="heat-cfnapi" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.172174 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" containerName="ceilometer-notification-agent" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.174592 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.175698 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.180102 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.180304 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.203477 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f9ff4476f-89c94"] Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.212511 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f9ff4476f-89c94"] Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.351780 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdmgq\" (UniqueName: \"kubernetes.io/projected/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-kube-api-access-gdmgq\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " 
pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.351909 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-config-data\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.351975 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.352118 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.352291 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-scripts\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.352489 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-run-httpd\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.352633 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-log-httpd\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.454730 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdmgq\" (UniqueName: \"kubernetes.io/projected/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-kube-api-access-gdmgq\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.454881 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-config-data\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.454922 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.455226 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.455828 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-scripts\") pod \"ceilometer-0\" (UID: 
\"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.456418 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-run-httpd\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.456511 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-log-httpd\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.457107 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-run-httpd\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.457512 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-log-httpd\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.467438 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-config-data\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.468746 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.468769 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.470962 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-scripts\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.474256 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdmgq\" (UniqueName: \"kubernetes.io/projected/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-kube-api-access-gdmgq\") pod \"ceilometer-0\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.495838 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.500073 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" event={"ID":"e1c27e30-fbbd-41e2-90d4-142797e326c8","Type":"ContainerStarted","Data":"7ecd769b6a1e46d990025bef1861dae972f622fb372e472d402868787dc49d23"} Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.504557 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76d966855c-lglrc" event={"ID":"358b4015-270f-4f22-918f-8c12b60603a3","Type":"ContainerStarted","Data":"f42a195674b2e32cd0605473d9c0be7d1c85bc3e532d844ba9eac716fa76d817"} Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.509883 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8677d45756-xk5bt" event={"ID":"0e626c44-78bc-403d-98cd-2a6b09ab189e","Type":"ContainerStarted","Data":"deeb66a875e5012b0b2caa82bc066bb93bf6abd30427ff1bcbd20cd1abf75c17"} Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.513149 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d9df55579-82dts" event={"ID":"3e9aee44-bbac-4f06-8187-cad533ab8a87","Type":"ContainerStarted","Data":"c4410971f6b899373326ef879474711509032a387b35892628a324bc98c2f780"} Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.524693 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" event={"ID":"2bde2be3-e30f-4116-bf00-2e6816dd43dc","Type":"ContainerStarted","Data":"f189e44204a58301d52f783d13d07ae739c5830dbb83c494f00436c34edb9622"} Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.685079 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="318d436c-d22c-415f-b171-66fa9901140f" path="/var/lib/kubelet/pods/318d436c-d22c-415f-b171-66fa9901140f/volumes" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.688784 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="95418b3b-b693-4b25-8ce8-967d233a1e54" path="/var/lib/kubelet/pods/95418b3b-b693-4b25-8ce8-967d233a1e54/volumes" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.689379 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41354a7-2b2f-4f4e-beb8-940543ae2e44" path="/var/lib/kubelet/pods/c41354a7-2b2f-4f4e-beb8-940543ae2e44/volumes" Feb 01 06:58:51 crc kubenswrapper[4546]: I0201 06:58:51.689902 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1d825a-ca7c-4a01-9f10-52876f202ef6" path="/var/lib/kubelet/pods/dd1d825a-ca7c-4a01-9f10-52876f202ef6/volumes" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.120619 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.535299 4546 generic.go:334] "Generic (PLEG): container finished" podID="e1c27e30-fbbd-41e2-90d4-142797e326c8" containerID="17af906129f9c6050dc52181cc34cf5f81be48ec98c3ba3a78a969f1b8542cf0" exitCode=1 Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.535454 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" event={"ID":"e1c27e30-fbbd-41e2-90d4-142797e326c8","Type":"ContainerDied","Data":"17af906129f9c6050dc52181cc34cf5f81be48ec98c3ba3a78a969f1b8542cf0"} Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.536541 4546 scope.go:117] "RemoveContainer" containerID="17af906129f9c6050dc52181cc34cf5f81be48ec98c3ba3a78a969f1b8542cf0" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.537530 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d9df55579-82dts" event={"ID":"3e9aee44-bbac-4f06-8187-cad533ab8a87","Type":"ContainerStarted","Data":"ec34fbbbf72bd0da05b1e7585139becb68c1849e9bf394a9953260a94da7ae9d"} Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.538706 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.540172 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76d966855c-lglrc" event={"ID":"358b4015-270f-4f22-918f-8c12b60603a3","Type":"ContainerStarted","Data":"e17468f84373339bf7c87bcab7eab0035aaf2c11f49b15dc8790bdd21514ee76"} Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.540267 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.541887 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8677d45756-xk5bt" event={"ID":"0e626c44-78bc-403d-98cd-2a6b09ab189e","Type":"ContainerStarted","Data":"914048b261dff01647e7f02f96ed860b06169c9ecb7e57191022b03e2163a698"} Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.542001 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.543132 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerStarted","Data":"367d65072352ae4f3d0a4172d368b7193e04e45e64f4412d66d18b521d6f04ed"} Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.544518 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" event={"ID":"2bde2be3-e30f-4116-bf00-2e6816dd43dc","Type":"ContainerStarted","Data":"515ce3ed30b3b51576f11c3a16ce8355922f909ed0ce5a6cb848a8c229cef5a1"} Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.544986 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.572245 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-76d966855c-lglrc" 
podStartSLOduration=11.572230091 podStartE2EDuration="11.572230091s" podCreationTimestamp="2026-02-01 06:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:52.567953449 +0000 UTC m=+963.218889466" watchObservedRunningTime="2026-02-01 06:58:52.572230091 +0000 UTC m=+963.223166107" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.658113 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" podStartSLOduration=8.658086177 podStartE2EDuration="8.658086177s" podCreationTimestamp="2026-02-01 06:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:52.628169983 +0000 UTC m=+963.279105999" watchObservedRunningTime="2026-02-01 06:58:52.658086177 +0000 UTC m=+963.309022193" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.658593 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.710155 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6d9df55579-82dts" podStartSLOduration=10.710136457 podStartE2EDuration="10.710136457s" podCreationTimestamp="2026-02-01 06:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:52.674769418 +0000 UTC m=+963.325705434" watchObservedRunningTime="2026-02-01 06:58:52.710136457 +0000 UTC m=+963.361072473" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.710244 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-8677d45756-xk5bt" podStartSLOduration=8.710240203 podStartE2EDuration="8.710240203s" podCreationTimestamp="2026-02-01 06:58:44 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:58:52.6976504 +0000 UTC m=+963.348586416" watchObservedRunningTime="2026-02-01 06:58:52.710240203 +0000 UTC m=+963.361176209" Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.804403 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-748cdb7884-m5r26"] Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.804949 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-748cdb7884-m5r26" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-api" containerID="cri-o://8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999" gracePeriod=30 Feb 01 06:58:52 crc kubenswrapper[4546]: I0201 06:58:52.805407 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-748cdb7884-m5r26" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-httpd" containerID="cri-o://bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab" gracePeriod=30 Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.515942 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.530345 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-868f7bb468-rfpkj" Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.572705 4546 generic.go:334] "Generic (PLEG): container finished" podID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerID="bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab" exitCode=0 Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.572774 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748cdb7884-m5r26" 
event={"ID":"49b573cc-fc40-4ae5-825b-84e1723756e7","Type":"ContainerDied","Data":"bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab"} Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.585793 4546 generic.go:334] "Generic (PLEG): container finished" podID="e1c27e30-fbbd-41e2-90d4-142797e326c8" containerID="adae2545d074a06c430ae7f551f3503426cfd200b5ade58b510a18387d7d751b" exitCode=1 Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.585909 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" event={"ID":"e1c27e30-fbbd-41e2-90d4-142797e326c8","Type":"ContainerDied","Data":"adae2545d074a06c430ae7f551f3503426cfd200b5ade58b510a18387d7d751b"} Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.585948 4546 scope.go:117] "RemoveContainer" containerID="17af906129f9c6050dc52181cc34cf5f81be48ec98c3ba3a78a969f1b8542cf0" Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.586812 4546 scope.go:117] "RemoveContainer" containerID="adae2545d074a06c430ae7f551f3503426cfd200b5ade58b510a18387d7d751b" Feb 01 06:58:53 crc kubenswrapper[4546]: E0201 06:58:53.587171 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54876bc7f7-6wwtv_openstack(e1c27e30-fbbd-41e2-90d4-142797e326c8)\"" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.593581 4546 generic.go:334] "Generic (PLEG): container finished" podID="3e9aee44-bbac-4f06-8187-cad533ab8a87" containerID="ec34fbbbf72bd0da05b1e7585139becb68c1849e9bf394a9953260a94da7ae9d" exitCode=1 Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.593682 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d9df55579-82dts" 
event={"ID":"3e9aee44-bbac-4f06-8187-cad533ab8a87","Type":"ContainerDied","Data":"ec34fbbbf72bd0da05b1e7585139becb68c1849e9bf394a9953260a94da7ae9d"} Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.594434 4546 scope.go:117] "RemoveContainer" containerID="ec34fbbbf72bd0da05b1e7585139becb68c1849e9bf394a9953260a94da7ae9d" Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.598105 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerStarted","Data":"6abc00dd15ff054997ffff2c619d3aaee4441689b3a636dd0dfeac0c4dd126c8"} Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.643917 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7587b5bb54-sqc4h"] Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.644165 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7587b5bb54-sqc4h" podUID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" containerName="placement-log" containerID="cri-o://3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5" gracePeriod=30 Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.644572 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7587b5bb54-sqc4h" podUID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" containerName="placement-api" containerID="cri-o://e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6" gracePeriod=30 Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.799080 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:58:53 crc kubenswrapper[4546]: I0201 06:58:53.878526 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:58:54 crc kubenswrapper[4546]: I0201 06:58:54.610980 4546 generic.go:334] "Generic (PLEG): container finished" 
podID="3e9aee44-bbac-4f06-8187-cad533ab8a87" containerID="fdf4cd2d09eda917ace2f1cd615134e663df01212b679a8729ed343457136921" exitCode=1 Feb 01 06:58:54 crc kubenswrapper[4546]: I0201 06:58:54.611735 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d9df55579-82dts" event={"ID":"3e9aee44-bbac-4f06-8187-cad533ab8a87","Type":"ContainerDied","Data":"fdf4cd2d09eda917ace2f1cd615134e663df01212b679a8729ed343457136921"} Feb 01 06:58:54 crc kubenswrapper[4546]: I0201 06:58:54.611926 4546 scope.go:117] "RemoveContainer" containerID="fdf4cd2d09eda917ace2f1cd615134e663df01212b679a8729ed343457136921" Feb 01 06:58:54 crc kubenswrapper[4546]: I0201 06:58:54.612061 4546 scope.go:117] "RemoveContainer" containerID="ec34fbbbf72bd0da05b1e7585139becb68c1849e9bf394a9953260a94da7ae9d" Feb 01 06:58:54 crc kubenswrapper[4546]: E0201 06:58:54.612308 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6d9df55579-82dts_openstack(3e9aee44-bbac-4f06-8187-cad533ab8a87)\"" pod="openstack/heat-api-6d9df55579-82dts" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" Feb 01 06:58:54 crc kubenswrapper[4546]: I0201 06:58:54.616241 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerStarted","Data":"5a7877531d717d1ff1fd6822147f05f056f4168e1b8e61a907a6a064a3714785"} Feb 01 06:58:54 crc kubenswrapper[4546]: I0201 06:58:54.633904 4546 generic.go:334] "Generic (PLEG): container finished" podID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" containerID="3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5" exitCode=143 Feb 01 06:58:54 crc kubenswrapper[4546]: I0201 06:58:54.633995 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7587b5bb54-sqc4h" 
event={"ID":"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9","Type":"ContainerDied","Data":"3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5"} Feb 01 06:58:54 crc kubenswrapper[4546]: I0201 06:58:54.637734 4546 scope.go:117] "RemoveContainer" containerID="adae2545d074a06c430ae7f551f3503426cfd200b5ade58b510a18387d7d751b" Feb 01 06:58:54 crc kubenswrapper[4546]: E0201 06:58:54.637921 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54876bc7f7-6wwtv_openstack(e1c27e30-fbbd-41e2-90d4-142797e326c8)\"" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.420943 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.421597 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.421724 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.422536 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf1cd428222258ce8831b2b35aceea3cc1215cfdc89e91fc366faeefbc43f53d"} 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.422696 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://bf1cd428222258ce8831b2b35aceea3cc1215cfdc89e91fc366faeefbc43f53d" gracePeriod=600 Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.647602 4546 scope.go:117] "RemoveContainer" containerID="fdf4cd2d09eda917ace2f1cd615134e663df01212b679a8729ed343457136921" Feb 01 06:58:55 crc kubenswrapper[4546]: E0201 06:58:55.648041 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6d9df55579-82dts_openstack(3e9aee44-bbac-4f06-8187-cad533ab8a87)\"" pod="openstack/heat-api-6d9df55579-82dts" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.649894 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerStarted","Data":"cbda65d3806cbb4d8f56d6368f3212621e6f3dc67dd5228befc308d51f8c92e2"} Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.652628 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="bf1cd428222258ce8831b2b35aceea3cc1215cfdc89e91fc366faeefbc43f53d" exitCode=0 Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.652697 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"bf1cd428222258ce8831b2b35aceea3cc1215cfdc89e91fc366faeefbc43f53d"} Feb 01 06:58:55 crc kubenswrapper[4546]: I0201 06:58:55.652746 4546 scope.go:117] "RemoveContainer" containerID="35c1ceef8d4590b6c0af653c1017461916a166a5c1d2dcb5faa5ca14e92cf91e" Feb 01 06:58:56 crc kubenswrapper[4546]: I0201 06:58:56.668077 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"ff755a5e7d12266478d722de4d8aa4b38f438587098c0c86e0cd4cb579735ed6"} Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.436211 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.437008 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.437564 4546 scope.go:117] "RemoveContainer" containerID="fdf4cd2d09eda917ace2f1cd615134e663df01212b679a8729ed343457136921" Feb 01 06:58:57 crc kubenswrapper[4546]: E0201 06:58:57.437827 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-6d9df55579-82dts_openstack(3e9aee44-bbac-4f06-8187-cad533ab8a87)\"" pod="openstack/heat-api-6d9df55579-82dts" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.454108 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.454259 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:58:57 crc 
kubenswrapper[4546]: I0201 06:58:57.454716 4546 scope.go:117] "RemoveContainer" containerID="adae2545d074a06c430ae7f551f3503426cfd200b5ade58b510a18387d7d751b" Feb 01 06:58:57 crc kubenswrapper[4546]: E0201 06:58:57.455040 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54876bc7f7-6wwtv_openstack(e1c27e30-fbbd-41e2-90d4-142797e326c8)\"" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.560945 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.638618 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-public-tls-certs\") pod \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.639171 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqfh9\" (UniqueName: \"kubernetes.io/projected/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-kube-api-access-bqfh9\") pod \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.639422 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-combined-ca-bundle\") pod \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.639451 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-logs\") pod \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.639527 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-config-data\") pod \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.639542 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-internal-tls-certs\") pod \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.639585 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-scripts\") pod \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\" (UID: \"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9\") " Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.643139 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-logs" (OuterVolumeSpecName: "logs") pod "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" (UID: "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.654987 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-scripts" (OuterVolumeSpecName: "scripts") pod "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" (UID: "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.661079 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-kube-api-access-bqfh9" (OuterVolumeSpecName: "kube-api-access-bqfh9") pod "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" (UID: "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9"). InnerVolumeSpecName "kube-api-access-bqfh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.734801 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerStarted","Data":"3285b5a0552f69aab50f787e1bc0c7dcf59fa06c8cc7df6b09e1767e3930ba44"} Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.736362 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.745751 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqfh9\" (UniqueName: \"kubernetes.io/projected/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-kube-api-access-bqfh9\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.745789 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.745800 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.759589 4546 generic.go:334] "Generic (PLEG): container finished" podID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" 
containerID="e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6" exitCode=0 Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.759804 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.93830692 podStartE2EDuration="6.759792678s" podCreationTimestamp="2026-02-01 06:58:51 +0000 UTC" firstStartedPulling="2026-02-01 06:58:52.136402456 +0000 UTC m=+962.787338472" lastFinishedPulling="2026-02-01 06:58:56.957888213 +0000 UTC m=+967.608824230" observedRunningTime="2026-02-01 06:58:57.759384729 +0000 UTC m=+968.410320745" watchObservedRunningTime="2026-02-01 06:58:57.759792678 +0000 UTC m=+968.410728694" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.760435 4546 scope.go:117] "RemoveContainer" containerID="adae2545d074a06c430ae7f551f3503426cfd200b5ade58b510a18387d7d751b" Feb 01 06:58:57 crc kubenswrapper[4546]: E0201 06:58:57.760755 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-54876bc7f7-6wwtv_openstack(e1c27e30-fbbd-41e2-90d4-142797e326c8)\"" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.761165 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7587b5bb54-sqc4h" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.761937 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7587b5bb54-sqc4h" event={"ID":"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9","Type":"ContainerDied","Data":"e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6"} Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.762025 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7587b5bb54-sqc4h" event={"ID":"06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9","Type":"ContainerDied","Data":"fb4f3b5dd3310223b1799ba189beca45b1eae3f79b486f7516098e204fd0b1d8"} Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.762097 4546 scope.go:117] "RemoveContainer" containerID="e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.782282 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" (UID: "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.797349 4546 scope.go:117] "RemoveContainer" containerID="3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.807557 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-config-data" (OuterVolumeSpecName: "config-data") pod "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" (UID: "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.821702 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" (UID: "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.837050 4546 scope.go:117] "RemoveContainer" containerID="e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6" Feb 01 06:58:57 crc kubenswrapper[4546]: E0201 06:58:57.837961 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6\": container with ID starting with e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6 not found: ID does not exist" containerID="e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.838053 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6"} err="failed to get container status \"e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6\": rpc error: code = NotFound desc = could not find container \"e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6\": container with ID starting with e4fb793f154390af6724c3f17630571c354ccb1cdcedf9050eb51b973757f9f6 not found: ID does not exist" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.838131 4546 scope.go:117] "RemoveContainer" containerID="3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5" Feb 01 06:58:57 crc kubenswrapper[4546]: E0201 06:58:57.839911 4546 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5\": container with ID starting with 3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5 not found: ID does not exist" containerID="3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.840008 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5"} err="failed to get container status \"3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5\": rpc error: code = NotFound desc = could not find container \"3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5\": container with ID starting with 3068843bedbd1930bc502f1862c05f95b030832641ec684d9bc98e8f394b7fa5 not found: ID does not exist" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.848221 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.848264 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.848276 4546 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.848460 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" (UID: "06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:58:57 crc kubenswrapper[4546]: I0201 06:58:57.950629 4546 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:58:58 crc kubenswrapper[4546]: I0201 06:58:58.163470 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7587b5bb54-sqc4h"] Feb 01 06:58:58 crc kubenswrapper[4546]: I0201 06:58:58.181016 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7587b5bb54-sqc4h"] Feb 01 06:58:59 crc kubenswrapper[4546]: I0201 06:58:59.668462 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" path="/var/lib/kubelet/pods/06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9/volumes" Feb 01 06:59:01 crc kubenswrapper[4546]: I0201 06:59:01.908122 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:01 crc kubenswrapper[4546]: I0201 06:59:01.909217 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="ceilometer-central-agent" containerID="cri-o://6abc00dd15ff054997ffff2c619d3aaee4441689b3a636dd0dfeac0c4dd126c8" gracePeriod=30 Feb 01 06:59:01 crc kubenswrapper[4546]: I0201 06:59:01.909970 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="proxy-httpd" containerID="cri-o://3285b5a0552f69aab50f787e1bc0c7dcf59fa06c8cc7df6b09e1767e3930ba44" gracePeriod=30 Feb 01 
06:59:01 crc kubenswrapper[4546]: I0201 06:59:01.910058 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="sg-core" containerID="cri-o://cbda65d3806cbb4d8f56d6368f3212621e6f3dc67dd5228befc308d51f8c92e2" gracePeriod=30 Feb 01 06:59:01 crc kubenswrapper[4546]: I0201 06:59:01.910117 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="ceilometer-notification-agent" containerID="cri-o://5a7877531d717d1ff1fd6822147f05f056f4168e1b8e61a907a6a064a3714785" gracePeriod=30 Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.026686 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-667c96b6cb-hf7jb" Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.101844 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-54876bc7f7-6wwtv"] Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.197233 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-8677d45756-xk5bt" Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.261738 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6d9df55579-82dts"] Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.449020 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-76d966855c-lglrc" Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.512384 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-66c6d5d4cd-sncfn"] Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.513933 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-66c6d5d4cd-sncfn" podUID="ea10db39-8540-4ff0-9a34-859b497605a9" containerName="heat-engine" 
containerID="cri-o://ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2" gracePeriod=60 Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.772609 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.823918 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" event={"ID":"e1c27e30-fbbd-41e2-90d4-142797e326c8","Type":"ContainerDied","Data":"7ecd769b6a1e46d990025bef1861dae972f622fb372e472d402868787dc49d23"} Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.824001 4546 scope.go:117] "RemoveContainer" containerID="adae2545d074a06c430ae7f551f3503426cfd200b5ade58b510a18387d7d751b" Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.824171 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-54876bc7f7-6wwtv" Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.844618 4546 generic.go:334] "Generic (PLEG): container finished" podID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerID="3285b5a0552f69aab50f787e1bc0c7dcf59fa06c8cc7df6b09e1767e3930ba44" exitCode=0 Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.844647 4546 generic.go:334] "Generic (PLEG): container finished" podID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerID="cbda65d3806cbb4d8f56d6368f3212621e6f3dc67dd5228befc308d51f8c92e2" exitCode=2 Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.844661 4546 generic.go:334] "Generic (PLEG): container finished" podID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerID="5a7877531d717d1ff1fd6822147f05f056f4168e1b8e61a907a6a064a3714785" exitCode=0 Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.844669 4546 generic.go:334] "Generic (PLEG): container finished" podID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerID="6abc00dd15ff054997ffff2c619d3aaee4441689b3a636dd0dfeac0c4dd126c8" exitCode=0 
Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.844694 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerDied","Data":"3285b5a0552f69aab50f787e1bc0c7dcf59fa06c8cc7df6b09e1767e3930ba44"} Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.844726 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerDied","Data":"cbda65d3806cbb4d8f56d6368f3212621e6f3dc67dd5228befc308d51f8c92e2"} Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.844738 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerDied","Data":"5a7877531d717d1ff1fd6822147f05f056f4168e1b8e61a907a6a064a3714785"} Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.844747 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerDied","Data":"6abc00dd15ff054997ffff2c619d3aaee4441689b3a636dd0dfeac0c4dd126c8"} Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.895976 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data-custom\") pod \"e1c27e30-fbbd-41e2-90d4-142797e326c8\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.897175 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data\") pod \"e1c27e30-fbbd-41e2-90d4-142797e326c8\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.897286 4546 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-combined-ca-bundle\") pod \"e1c27e30-fbbd-41e2-90d4-142797e326c8\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.897342 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.897427 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqtz9\" (UniqueName: \"kubernetes.io/projected/e1c27e30-fbbd-41e2-90d4-142797e326c8-kube-api-access-gqtz9\") pod \"e1c27e30-fbbd-41e2-90d4-142797e326c8\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.906692 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1c27e30-fbbd-41e2-90d4-142797e326c8" (UID: "e1c27e30-fbbd-41e2-90d4-142797e326c8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.908725 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c27e30-fbbd-41e2-90d4-142797e326c8-kube-api-access-gqtz9" (OuterVolumeSpecName: "kube-api-access-gqtz9") pod "e1c27e30-fbbd-41e2-90d4-142797e326c8" (UID: "e1c27e30-fbbd-41e2-90d4-142797e326c8"). InnerVolumeSpecName "kube-api-access-gqtz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:02 crc kubenswrapper[4546]: I0201 06:59:02.940951 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1c27e30-fbbd-41e2-90d4-142797e326c8" (UID: "e1c27e30-fbbd-41e2-90d4-142797e326c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.001059 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data" (OuterVolumeSpecName: "config-data") pod "e1c27e30-fbbd-41e2-90d4-142797e326c8" (UID: "e1c27e30-fbbd-41e2-90d4-142797e326c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.002456 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-combined-ca-bundle\") pod \"3e9aee44-bbac-4f06-8187-cad533ab8a87\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.002571 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkmww\" (UniqueName: \"kubernetes.io/projected/3e9aee44-bbac-4f06-8187-cad533ab8a87-kube-api-access-qkmww\") pod \"3e9aee44-bbac-4f06-8187-cad533ab8a87\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.002637 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data\") pod \"3e9aee44-bbac-4f06-8187-cad533ab8a87\" (UID: 
\"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.002761 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data\") pod \"e1c27e30-fbbd-41e2-90d4-142797e326c8\" (UID: \"e1c27e30-fbbd-41e2-90d4-142797e326c8\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.002790 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data-custom\") pod \"3e9aee44-bbac-4f06-8187-cad533ab8a87\" (UID: \"3e9aee44-bbac-4f06-8187-cad533ab8a87\") " Feb 01 06:59:03 crc kubenswrapper[4546]: W0201 06:59:03.003369 4546 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e1c27e30-fbbd-41e2-90d4-142797e326c8/volumes/kubernetes.io~secret/config-data Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.003415 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data" (OuterVolumeSpecName: "config-data") pod "e1c27e30-fbbd-41e2-90d4-142797e326c8" (UID: "e1c27e30-fbbd-41e2-90d4-142797e326c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.004226 4546 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.004251 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.004263 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c27e30-fbbd-41e2-90d4-142797e326c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.004276 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqtz9\" (UniqueName: \"kubernetes.io/projected/e1c27e30-fbbd-41e2-90d4-142797e326c8-kube-api-access-gqtz9\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.008186 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e9aee44-bbac-4f06-8187-cad533ab8a87" (UID: "3e9aee44-bbac-4f06-8187-cad533ab8a87"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.013005 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e9aee44-bbac-4f06-8187-cad533ab8a87-kube-api-access-qkmww" (OuterVolumeSpecName: "kube-api-access-qkmww") pod "3e9aee44-bbac-4f06-8187-cad533ab8a87" (UID: "3e9aee44-bbac-4f06-8187-cad533ab8a87"). InnerVolumeSpecName "kube-api-access-qkmww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.068544 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e9aee44-bbac-4f06-8187-cad533ab8a87" (UID: "3e9aee44-bbac-4f06-8187-cad533ab8a87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.107572 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkmww\" (UniqueName: \"kubernetes.io/projected/3e9aee44-bbac-4f06-8187-cad533ab8a87-kube-api-access-qkmww\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.107601 4546 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.107611 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.127059 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data" (OuterVolumeSpecName: "config-data") pod "3e9aee44-bbac-4f06-8187-cad533ab8a87" (UID: "3e9aee44-bbac-4f06-8187-cad533ab8a87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.181764 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.201476 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-54876bc7f7-6wwtv"] Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.209609 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-54876bc7f7-6wwtv"] Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.211452 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e9aee44-bbac-4f06-8187-cad533ab8a87-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.312095 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-combined-ca-bundle\") pod \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.312308 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-sg-core-conf-yaml\") pod \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.312345 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-run-httpd\") pod \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.312413 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-scripts\") pod \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\" (UID: 
\"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.312455 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-config-data\") pod \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.312486 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdmgq\" (UniqueName: \"kubernetes.io/projected/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-kube-api-access-gdmgq\") pod \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.312525 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-log-httpd\") pod \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\" (UID: \"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.313095 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" (UID: "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.313522 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" (UID: "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.324014 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-scripts" (OuterVolumeSpecName: "scripts") pod "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" (UID: "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.324151 4546 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1c27e30_fbbd_41e2_90d4_142797e326c8.slice/crio-7ecd769b6a1e46d990025bef1861dae972f622fb372e472d402868787dc49d23\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1c27e30_fbbd_41e2_90d4_142797e326c8.slice\": RecentStats: unable to find data in memory cache]" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.352149 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" (UID: "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.355734 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-kube-api-access-gdmgq" (OuterVolumeSpecName: "kube-api-access-gdmgq") pod "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" (UID: "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50"). InnerVolumeSpecName "kube-api-access-gdmgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.415966 4546 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.416055 4546 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.416109 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.416167 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdmgq\" (UniqueName: \"kubernetes.io/projected/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-kube-api-access-gdmgq\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.416224 4546 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.449101 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" (UID: "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.468191 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-config-data" (OuterVolumeSpecName: "config-data") pod "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" (UID: "8b85ccde-7e7a-4c2d-b003-a73fc46d9a50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.518285 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.518318 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.589008 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.677334 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" path="/var/lib/kubelet/pods/e1c27e30-fbbd-41e2-90d4-142797e326c8/volumes" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.730672 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-config\") pod \"49b573cc-fc40-4ae5-825b-84e1723756e7\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.730865 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-ovndb-tls-certs\") pod \"49b573cc-fc40-4ae5-825b-84e1723756e7\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.730951 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-combined-ca-bundle\") pod \"49b573cc-fc40-4ae5-825b-84e1723756e7\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.731069 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2dct\" (UniqueName: \"kubernetes.io/projected/49b573cc-fc40-4ae5-825b-84e1723756e7-kube-api-access-j2dct\") pod \"49b573cc-fc40-4ae5-825b-84e1723756e7\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.731106 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-httpd-config\") pod 
\"49b573cc-fc40-4ae5-825b-84e1723756e7\" (UID: \"49b573cc-fc40-4ae5-825b-84e1723756e7\") " Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.741270 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b573cc-fc40-4ae5-825b-84e1723756e7-kube-api-access-j2dct" (OuterVolumeSpecName: "kube-api-access-j2dct") pod "49b573cc-fc40-4ae5-825b-84e1723756e7" (UID: "49b573cc-fc40-4ae5-825b-84e1723756e7"). InnerVolumeSpecName "kube-api-access-j2dct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.741456 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "49b573cc-fc40-4ae5-825b-84e1723756e7" (UID: "49b573cc-fc40-4ae5-825b-84e1723756e7"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.757533 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.765064 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.766630 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.766715 4546 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-66c6d5d4cd-sncfn" podUID="ea10db39-8540-4ff0-9a34-859b497605a9" containerName="heat-engine" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.832965 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49b573cc-fc40-4ae5-825b-84e1723756e7" (UID: "49b573cc-fc40-4ae5-825b-84e1723756e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.834780 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.834919 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2dct\" (UniqueName: \"kubernetes.io/projected/49b573cc-fc40-4ae5-825b-84e1723756e7-kube-api-access-j2dct\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.835015 4546 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.838414 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-config" (OuterVolumeSpecName: "config") pod "49b573cc-fc40-4ae5-825b-84e1723756e7" (UID: "49b573cc-fc40-4ae5-825b-84e1723756e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.851977 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "49b573cc-fc40-4ae5-825b-84e1723756e7" (UID: "49b573cc-fc40-4ae5-825b-84e1723756e7"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864421 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-wk829"] Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864791 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" containerName="heat-cfnapi" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864810 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" containerName="heat-cfnapi" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864827 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" containerName="heat-api" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864833 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" containerName="heat-api" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864844 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-httpd" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864850 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-httpd" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864871 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="ceilometer-central-agent" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864877 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="ceilometer-central-agent" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864887 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" containerName="heat-cfnapi" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864893 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" containerName="heat-cfnapi" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864901 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="sg-core" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864906 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="sg-core" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864915 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-api" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864922 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-api" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864928 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" containerName="placement-log" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864934 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" containerName="placement-log" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864951 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="proxy-httpd" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864956 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="proxy-httpd" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864971 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="ceilometer-notification-agent" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864976 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="ceilometer-notification-agent" Feb 01 06:59:03 crc kubenswrapper[4546]: E0201 06:59:03.864991 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" containerName="placement-api" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.864996 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" containerName="placement-api" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865172 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="ceilometer-central-agent" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865185 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" containerName="heat-api" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865193 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="proxy-httpd" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865204 4546 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-api" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865214 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="ceilometer-notification-agent" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865223 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" containerName="placement-api" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865231 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" containerName="heat-cfnapi" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865240 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" containerName="sg-core" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865249 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c27e30-fbbd-41e2-90d4-142797e326c8" containerName="heat-cfnapi" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865260 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerName="neutron-httpd" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.865270 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="06dfb67b-6bab-4ca5-8fcb-e938e6cdd6d9" containerName="placement-log" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.867343 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-wk829" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.869138 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6d9df55579-82dts" event={"ID":"3e9aee44-bbac-4f06-8187-cad533ab8a87","Type":"ContainerDied","Data":"c4410971f6b899373326ef879474711509032a387b35892628a324bc98c2f780"} Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.869187 4546 scope.go:117] "RemoveContainer" containerID="fdf4cd2d09eda917ace2f1cd615134e663df01212b679a8729ed343457136921" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.869294 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6d9df55579-82dts" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.875600 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b85ccde-7e7a-4c2d-b003-a73fc46d9a50","Type":"ContainerDied","Data":"367d65072352ae4f3d0a4172d368b7193e04e45e64f4412d66d18b521d6f04ed"} Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.875637 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.883474 4546 generic.go:334] "Generic (PLEG): container finished" podID="49b573cc-fc40-4ae5-825b-84e1723756e7" containerID="8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999" exitCode=0 Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.883527 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748cdb7884-m5r26" event={"ID":"49b573cc-fc40-4ae5-825b-84e1723756e7","Type":"ContainerDied","Data":"8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999"} Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.883591 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-748cdb7884-m5r26" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.883613 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748cdb7884-m5r26" event={"ID":"49b573cc-fc40-4ae5-825b-84e1723756e7","Type":"ContainerDied","Data":"5d4f3f345f84654584edb69659d9d0057a5004251db3715ade660b1cc289aaed"} Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.887559 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wk829"] Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.905010 4546 scope.go:117] "RemoveContainer" containerID="3285b5a0552f69aab50f787e1bc0c7dcf59fa06c8cc7df6b09e1767e3930ba44" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.938593 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrlfp\" (UniqueName: \"kubernetes.io/projected/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-kube-api-access-mrlfp\") pod \"nova-api-db-create-wk829\" (UID: \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\") " pod="openstack/nova-api-db-create-wk829" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.938644 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-operator-scripts\") pod \"nova-api-db-create-wk829\" (UID: \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\") " pod="openstack/nova-api-db-create-wk829" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.938743 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.938755 4546 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/49b573cc-fc40-4ae5-825b-84e1723756e7-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.961125 4546 scope.go:117] "RemoveContainer" containerID="cbda65d3806cbb4d8f56d6368f3212621e6f3dc67dd5228befc308d51f8c92e2" Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.967469 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6d9df55579-82dts"] Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.978452 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6d9df55579-82dts"] Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.988056 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:03 crc kubenswrapper[4546]: I0201 06:59:03.992035 4546 scope.go:117] "RemoveContainer" containerID="5a7877531d717d1ff1fd6822147f05f056f4168e1b8e61a907a6a064a3714785" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.029575 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4kbrz"] Feb 01 06:59:04 crc kubenswrapper[4546]: E0201 06:59:04.030107 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" containerName="heat-api" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.030128 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" containerName="heat-api" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.030387 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" containerName="heat-api" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.031141 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-4kbrz" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.038617 4546 scope.go:117] "RemoveContainer" containerID="6abc00dd15ff054997ffff2c619d3aaee4441689b3a636dd0dfeac0c4dd126c8" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.040536 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrlfp\" (UniqueName: \"kubernetes.io/projected/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-kube-api-access-mrlfp\") pod \"nova-api-db-create-wk829\" (UID: \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\") " pod="openstack/nova-api-db-create-wk829" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.040583 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-operator-scripts\") pod \"nova-api-db-create-wk829\" (UID: \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\") " pod="openstack/nova-api-db-create-wk829" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.041564 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-operator-scripts\") pod \"nova-api-db-create-wk829\" (UID: \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\") " pod="openstack/nova-api-db-create-wk829" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.054556 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.062528 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4kbrz"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.067904 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-748cdb7884-m5r26"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.074801 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-748cdb7884-m5r26"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.076523 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.078709 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.081209 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.083524 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.083737 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.089481 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrlfp\" (UniqueName: \"kubernetes.io/projected/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-kube-api-access-mrlfp\") pod \"nova-api-db-create-wk829\" (UID: \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\") " pod="openstack/nova-api-db-create-wk829" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.089533 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-lvf28"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.091001 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lvf28" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.097476 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cfac-account-create-update-lhf9t"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.098617 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cfac-account-create-update-lhf9t" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.104223 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.115126 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cfac-account-create-update-lhf9t"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.130410 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lvf28"] Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.144046 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ph5v\" (UniqueName: \"kubernetes.io/projected/77705152-25fc-47d3-b448-00144a74f075-kube-api-access-5ph5v\") pod \"nova-cell0-db-create-4kbrz\" (UID: \"77705152-25fc-47d3-b448-00144a74f075\") " pod="openstack/nova-cell0-db-create-4kbrz" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.144077 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-run-httpd\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.144126 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-config-data\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.144148 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.144255 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-scripts\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.144298 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77705152-25fc-47d3-b448-00144a74f075-operator-scripts\") pod \"nova-cell0-db-create-4kbrz\" (UID: \"77705152-25fc-47d3-b448-00144a74f075\") " pod="openstack/nova-cell0-db-create-4kbrz" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.144439 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-log-httpd\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.144533 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppx8t\" (UniqueName: \"kubernetes.io/projected/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-kube-api-access-ppx8t\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.144640 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.165761 4546 scope.go:117] "RemoveContainer" containerID="bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.188989 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wk829" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.191935 4546 scope.go:117] "RemoveContainer" containerID="8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.246620 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ph5v\" (UniqueName: \"kubernetes.io/projected/77705152-25fc-47d3-b448-00144a74f075-kube-api-access-5ph5v\") pod \"nova-cell0-db-create-4kbrz\" (UID: \"77705152-25fc-47d3-b448-00144a74f075\") " pod="openstack/nova-cell0-db-create-4kbrz" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.246907 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-run-httpd\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.246940 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-config-data\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0" Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.246959 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.246997 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2q8k\" (UniqueName: \"kubernetes.io/projected/ab34a556-843f-4e9a-becd-82452d0ad83d-kube-api-access-j2q8k\") pod \"nova-cell1-db-create-lvf28\" (UID: \"ab34a556-843f-4e9a-becd-82452d0ad83d\") " pod="openstack/nova-cell1-db-create-lvf28"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.247037 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-scripts\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.247060 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77705152-25fc-47d3-b448-00144a74f075-operator-scripts\") pod \"nova-cell0-db-create-4kbrz\" (UID: \"77705152-25fc-47d3-b448-00144a74f075\") " pod="openstack/nova-cell0-db-create-4kbrz"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.247076 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b60f534b-2c84-4054-99c1-c682e0a58c7f-operator-scripts\") pod \"nova-api-cfac-account-create-update-lhf9t\" (UID: \"b60f534b-2c84-4054-99c1-c682e0a58c7f\") " pod="openstack/nova-api-cfac-account-create-update-lhf9t"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.247117 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-log-httpd\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.247151 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkng7\" (UniqueName: \"kubernetes.io/projected/b60f534b-2c84-4054-99c1-c682e0a58c7f-kube-api-access-dkng7\") pod \"nova-api-cfac-account-create-update-lhf9t\" (UID: \"b60f534b-2c84-4054-99c1-c682e0a58c7f\") " pod="openstack/nova-api-cfac-account-create-update-lhf9t"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.247175 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppx8t\" (UniqueName: \"kubernetes.io/projected/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-kube-api-access-ppx8t\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.247193 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab34a556-843f-4e9a-becd-82452d0ad83d-operator-scripts\") pod \"nova-cell1-db-create-lvf28\" (UID: \"ab34a556-843f-4e9a-becd-82452d0ad83d\") " pod="openstack/nova-cell1-db-create-lvf28"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.247237 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.248184 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-log-httpd\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.248728 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77705152-25fc-47d3-b448-00144a74f075-operator-scripts\") pod \"nova-cell0-db-create-4kbrz\" (UID: \"77705152-25fc-47d3-b448-00144a74f075\") " pod="openstack/nova-cell0-db-create-4kbrz"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.249378 4546 scope.go:117] "RemoveContainer" containerID="bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.249828 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-run-httpd\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.252592 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-scripts\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.253137 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: E0201 06:59:04.255118 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab\": container with ID starting with bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab not found: ID does not exist" containerID="bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.255145 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab"} err="failed to get container status \"bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab\": rpc error: code = NotFound desc = could not find container \"bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab\": container with ID starting with bc91c97165c1132727408426aae4172cf960fd37f945840dd23fbde220607dab not found: ID does not exist"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.255168 4546 scope.go:117] "RemoveContainer" containerID="8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999"
Feb 01 06:59:04 crc kubenswrapper[4546]: E0201 06:59:04.255427 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999\": container with ID starting with 8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999 not found: ID does not exist" containerID="8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.255447 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999"} err="failed to get container status \"8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999\": rpc error: code = NotFound desc = could not find container \"8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999\": container with ID starting with 8ebba835036a660a4eed60dcde3691a8507a51c7dffdf8e9a59a779060060999 not found: ID does not exist"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.265758 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-config-data\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.271408 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.277544 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8edb-account-create-update-dzs8g"]
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.278652 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8edb-account-create-update-dzs8g"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.282193 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.287617 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppx8t\" (UniqueName: \"kubernetes.io/projected/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-kube-api-access-ppx8t\") pod \"ceilometer-0\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.307109 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ph5v\" (UniqueName: \"kubernetes.io/projected/77705152-25fc-47d3-b448-00144a74f075-kube-api-access-5ph5v\") pod \"nova-cell0-db-create-4kbrz\" (UID: \"77705152-25fc-47d3-b448-00144a74f075\") " pod="openstack/nova-cell0-db-create-4kbrz"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.312630 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8edb-account-create-update-dzs8g"]
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.348597 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b60f534b-2c84-4054-99c1-c682e0a58c7f-operator-scripts\") pod \"nova-api-cfac-account-create-update-lhf9t\" (UID: \"b60f534b-2c84-4054-99c1-c682e0a58c7f\") " pod="openstack/nova-api-cfac-account-create-update-lhf9t"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.348684 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m2pb\" (UniqueName: \"kubernetes.io/projected/0e59900e-a73b-4d2c-be24-130f43e15f6d-kube-api-access-8m2pb\") pod \"nova-cell0-8edb-account-create-update-dzs8g\" (UID: \"0e59900e-a73b-4d2c-be24-130f43e15f6d\") " pod="openstack/nova-cell0-8edb-account-create-update-dzs8g"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.348755 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkng7\" (UniqueName: \"kubernetes.io/projected/b60f534b-2c84-4054-99c1-c682e0a58c7f-kube-api-access-dkng7\") pod \"nova-api-cfac-account-create-update-lhf9t\" (UID: \"b60f534b-2c84-4054-99c1-c682e0a58c7f\") " pod="openstack/nova-api-cfac-account-create-update-lhf9t"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.348798 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab34a556-843f-4e9a-becd-82452d0ad83d-operator-scripts\") pod \"nova-cell1-db-create-lvf28\" (UID: \"ab34a556-843f-4e9a-becd-82452d0ad83d\") " pod="openstack/nova-cell1-db-create-lvf28"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.348828 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e59900e-a73b-4d2c-be24-130f43e15f6d-operator-scripts\") pod \"nova-cell0-8edb-account-create-update-dzs8g\" (UID: \"0e59900e-a73b-4d2c-be24-130f43e15f6d\") " pod="openstack/nova-cell0-8edb-account-create-update-dzs8g"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.348961 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2q8k\" (UniqueName: \"kubernetes.io/projected/ab34a556-843f-4e9a-becd-82452d0ad83d-kube-api-access-j2q8k\") pod \"nova-cell1-db-create-lvf28\" (UID: \"ab34a556-843f-4e9a-becd-82452d0ad83d\") " pod="openstack/nova-cell1-db-create-lvf28"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.349793 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab34a556-843f-4e9a-becd-82452d0ad83d-operator-scripts\") pod \"nova-cell1-db-create-lvf28\" (UID: \"ab34a556-843f-4e9a-becd-82452d0ad83d\") " pod="openstack/nova-cell1-db-create-lvf28"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.352289 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b60f534b-2c84-4054-99c1-c682e0a58c7f-operator-scripts\") pod \"nova-api-cfac-account-create-update-lhf9t\" (UID: \"b60f534b-2c84-4054-99c1-c682e0a58c7f\") " pod="openstack/nova-api-cfac-account-create-update-lhf9t"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.367548 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2q8k\" (UniqueName: \"kubernetes.io/projected/ab34a556-843f-4e9a-becd-82452d0ad83d-kube-api-access-j2q8k\") pod \"nova-cell1-db-create-lvf28\" (UID: \"ab34a556-843f-4e9a-becd-82452d0ad83d\") " pod="openstack/nova-cell1-db-create-lvf28"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.370731 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkng7\" (UniqueName: \"kubernetes.io/projected/b60f534b-2c84-4054-99c1-c682e0a58c7f-kube-api-access-dkng7\") pod \"nova-api-cfac-account-create-update-lhf9t\" (UID: \"b60f534b-2c84-4054-99c1-c682e0a58c7f\") " pod="openstack/nova-api-cfac-account-create-update-lhf9t"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.457613 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4kbrz"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.457447 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m2pb\" (UniqueName: \"kubernetes.io/projected/0e59900e-a73b-4d2c-be24-130f43e15f6d-kube-api-access-8m2pb\") pod \"nova-cell0-8edb-account-create-update-dzs8g\" (UID: \"0e59900e-a73b-4d2c-be24-130f43e15f6d\") " pod="openstack/nova-cell0-8edb-account-create-update-dzs8g"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.457761 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e59900e-a73b-4d2c-be24-130f43e15f6d-operator-scripts\") pod \"nova-cell0-8edb-account-create-update-dzs8g\" (UID: \"0e59900e-a73b-4d2c-be24-130f43e15f6d\") " pod="openstack/nova-cell0-8edb-account-create-update-dzs8g"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.458565 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e59900e-a73b-4d2c-be24-130f43e15f6d-operator-scripts\") pod \"nova-cell0-8edb-account-create-update-dzs8g\" (UID: \"0e59900e-a73b-4d2c-be24-130f43e15f6d\") " pod="openstack/nova-cell0-8edb-account-create-update-dzs8g"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.469014 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.474774 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9d33-account-create-update-jz942"]
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.495728 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9d33-account-create-update-jz942"]
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.496151 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9d33-account-create-update-jz942"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.496600 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cfac-account-create-update-lhf9t"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.498813 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lvf28"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.511571 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.526944 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m2pb\" (UniqueName: \"kubernetes.io/projected/0e59900e-a73b-4d2c-be24-130f43e15f6d-kube-api-access-8m2pb\") pod \"nova-cell0-8edb-account-create-update-dzs8g\" (UID: \"0e59900e-a73b-4d2c-be24-130f43e15f6d\") " pod="openstack/nova-cell0-8edb-account-create-update-dzs8g"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.564850 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-operator-scripts\") pod \"nova-cell1-9d33-account-create-update-jz942\" (UID: \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\") " pod="openstack/nova-cell1-9d33-account-create-update-jz942"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.565293 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddzr7\" (UniqueName: \"kubernetes.io/projected/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-kube-api-access-ddzr7\") pod \"nova-cell1-9d33-account-create-update-jz942\" (UID: \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\") " pod="openstack/nova-cell1-9d33-account-create-update-jz942"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.644445 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wk829"]
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.658453 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8edb-account-create-update-dzs8g"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.674589 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddzr7\" (UniqueName: \"kubernetes.io/projected/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-kube-api-access-ddzr7\") pod \"nova-cell1-9d33-account-create-update-jz942\" (UID: \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\") " pod="openstack/nova-cell1-9d33-account-create-update-jz942"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.674792 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-operator-scripts\") pod \"nova-cell1-9d33-account-create-update-jz942\" (UID: \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\") " pod="openstack/nova-cell1-9d33-account-create-update-jz942"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.675701 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-operator-scripts\") pod \"nova-cell1-9d33-account-create-update-jz942\" (UID: \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\") " pod="openstack/nova-cell1-9d33-account-create-update-jz942"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.694607 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddzr7\" (UniqueName: \"kubernetes.io/projected/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-kube-api-access-ddzr7\") pod \"nova-cell1-9d33-account-create-update-jz942\" (UID: \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\") " pod="openstack/nova-cell1-9d33-account-create-update-jz942"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.842265 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9d33-account-create-update-jz942"
Feb 01 06:59:04 crc kubenswrapper[4546]: I0201 06:59:04.938559 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wk829" event={"ID":"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b","Type":"ContainerStarted","Data":"a3a507cc0da81b2ffac1e39d03f8f3a693713cb3f69608e0e3b8feac38a261c0"}
Feb 01 06:59:05 crc kubenswrapper[4546]: I0201 06:59:05.107578 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4kbrz"]
Feb 01 06:59:05 crc kubenswrapper[4546]: I0201 06:59:05.532145 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lvf28"]
Feb 01 06:59:05 crc kubenswrapper[4546]: I0201 06:59:05.565671 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cfac-account-create-update-lhf9t"]
Feb 01 06:59:05 crc kubenswrapper[4546]: I0201 06:59:05.588539 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 01 06:59:05 crc kubenswrapper[4546]: I0201 06:59:05.609094 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8edb-account-create-update-dzs8g"]
Feb 01 06:59:05 crc kubenswrapper[4546]: I0201 06:59:05.689089 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e9aee44-bbac-4f06-8187-cad533ab8a87" path="/var/lib/kubelet/pods/3e9aee44-bbac-4f06-8187-cad533ab8a87/volumes"
Feb 01 06:59:05 crc kubenswrapper[4546]: I0201 06:59:05.690034 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b573cc-fc40-4ae5-825b-84e1723756e7" path="/var/lib/kubelet/pods/49b573cc-fc40-4ae5-825b-84e1723756e7/volumes"
Feb 01 06:59:05 crc kubenswrapper[4546]: I0201 06:59:05.690642 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b85ccde-7e7a-4c2d-b003-a73fc46d9a50" path="/var/lib/kubelet/pods/8b85ccde-7e7a-4c2d-b003-a73fc46d9a50/volumes"
Feb 01 06:59:05 crc kubenswrapper[4546]: I0201 06:59:05.737582 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9d33-account-create-update-jz942"]
Feb 01 06:59:05 crc kubenswrapper[4546]: W0201 06:59:05.748508 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2ce9cfb_87da_40b6_9676_492ba3cce8b6.slice/crio-feca276b2917b02b32e256725390ab36df23ce2f5d31676f0bedd0503b6c08a7 WatchSource:0}: Error finding container feca276b2917b02b32e256725390ab36df23ce2f5d31676f0bedd0503b6c08a7: Status 404 returned error can't find the container with id feca276b2917b02b32e256725390ab36df23ce2f5d31676f0bedd0503b6c08a7
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.019625 4546 generic.go:334] "Generic (PLEG): container finished" podID="77705152-25fc-47d3-b448-00144a74f075" containerID="f6d72ac5c09ae960f171fdfd21cbe511552cc2f3994e4cda668dc78bf8381031" exitCode=0
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.019842 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4kbrz" event={"ID":"77705152-25fc-47d3-b448-00144a74f075","Type":"ContainerDied","Data":"f6d72ac5c09ae960f171fdfd21cbe511552cc2f3994e4cda668dc78bf8381031"}
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.019976 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4kbrz" event={"ID":"77705152-25fc-47d3-b448-00144a74f075","Type":"ContainerStarted","Data":"deb77350c34aa45594a72316d6649b6257cc4a89230e3c5320fa0ff847ecc58c"}
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.031110 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerStarted","Data":"a79221df8f0a8efb355b65be309b092e31aff0d2b5ddb4b095ad5eb175bb741c"}
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.033479 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9d33-account-create-update-jz942" event={"ID":"b2ce9cfb-87da-40b6-9676-492ba3cce8b6","Type":"ContainerStarted","Data":"feca276b2917b02b32e256725390ab36df23ce2f5d31676f0bedd0503b6c08a7"}
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.041247 4546 generic.go:334] "Generic (PLEG): container finished" podID="ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b" containerID="9067223864ae5d28608217cc1cfc5125d7a838015e0d219c726a183ba312df27" exitCode=0
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.041334 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wk829" event={"ID":"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b","Type":"ContainerDied","Data":"9067223864ae5d28608217cc1cfc5125d7a838015e0d219c726a183ba312df27"}
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.064078 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8edb-account-create-update-dzs8g" event={"ID":"0e59900e-a73b-4d2c-be24-130f43e15f6d","Type":"ContainerStarted","Data":"546bb6203807d26c47f49c87b2f6f262a886319d9bdd1cfe28e3bf5432592c2b"}
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.070531 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lvf28" event={"ID":"ab34a556-843f-4e9a-becd-82452d0ad83d","Type":"ContainerStarted","Data":"53d07205bba756cefa74b66eb5b41c88eb9ad416b156b613ed807c60602b41a1"}
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.089982 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cfac-account-create-update-lhf9t" event={"ID":"b60f534b-2c84-4054-99c1-c682e0a58c7f","Type":"ContainerStarted","Data":"6e584a35b476bb05a436a3304b90e7f844f17eeb4de6dbfd2daac6677584f32a"}
Feb 01 06:59:06 crc kubenswrapper[4546]: I0201 06:59:06.623590 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.106761 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerStarted","Data":"f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41"}
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.112849 4546 generic.go:334] "Generic (PLEG): container finished" podID="b2ce9cfb-87da-40b6-9676-492ba3cce8b6" containerID="d7186a6dac4bf0bb0c2e5151c6b3c2e14f328f4a1fa681ca306d410166ac5f18" exitCode=0
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.112923 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9d33-account-create-update-jz942" event={"ID":"b2ce9cfb-87da-40b6-9676-492ba3cce8b6","Type":"ContainerDied","Data":"d7186a6dac4bf0bb0c2e5151c6b3c2e14f328f4a1fa681ca306d410166ac5f18"}
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.114737 4546 generic.go:334] "Generic (PLEG): container finished" podID="0e59900e-a73b-4d2c-be24-130f43e15f6d" containerID="bcafe1ab9e5c4ce3f8238bb18507759966fde00caa59ff2e1f79e841a950f01e" exitCode=0
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.115000 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8edb-account-create-update-dzs8g" event={"ID":"0e59900e-a73b-4d2c-be24-130f43e15f6d","Type":"ContainerDied","Data":"bcafe1ab9e5c4ce3f8238bb18507759966fde00caa59ff2e1f79e841a950f01e"}
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.116563 4546 generic.go:334] "Generic (PLEG): container finished" podID="ab34a556-843f-4e9a-becd-82452d0ad83d" containerID="34e90168d4d7b9f5e17f2abe8baf07cc356ed0e18d9351910cae5edfc03efcec" exitCode=0
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.116695 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lvf28" event={"ID":"ab34a556-843f-4e9a-becd-82452d0ad83d","Type":"ContainerDied","Data":"34e90168d4d7b9f5e17f2abe8baf07cc356ed0e18d9351910cae5edfc03efcec"}
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.118121 4546 generic.go:334] "Generic (PLEG): container finished" podID="b60f534b-2c84-4054-99c1-c682e0a58c7f" containerID="730f1d5d3c50e0bc4c7ff5c194eb5f8ec07c3d3089a8f4a08fef77bb51f056ef" exitCode=0
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.118503 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cfac-account-create-update-lhf9t" event={"ID":"b60f534b-2c84-4054-99c1-c682e0a58c7f","Type":"ContainerDied","Data":"730f1d5d3c50e0bc4c7ff5c194eb5f8ec07c3d3089a8f4a08fef77bb51f056ef"}
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.749003 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4kbrz"
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.753015 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wk829"
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.821714 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ph5v\" (UniqueName: \"kubernetes.io/projected/77705152-25fc-47d3-b448-00144a74f075-kube-api-access-5ph5v\") pod \"77705152-25fc-47d3-b448-00144a74f075\" (UID: \"77705152-25fc-47d3-b448-00144a74f075\") "
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.821901 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77705152-25fc-47d3-b448-00144a74f075-operator-scripts\") pod \"77705152-25fc-47d3-b448-00144a74f075\" (UID: \"77705152-25fc-47d3-b448-00144a74f075\") "
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.822103 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-operator-scripts\") pod \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\" (UID: \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\") "
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.822197 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrlfp\" (UniqueName: \"kubernetes.io/projected/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-kube-api-access-mrlfp\") pod \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\" (UID: \"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b\") "
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.824178 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77705152-25fc-47d3-b448-00144a74f075-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77705152-25fc-47d3-b448-00144a74f075" (UID: "77705152-25fc-47d3-b448-00144a74f075"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.824508 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b" (UID: "ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.833094 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-kube-api-access-mrlfp" (OuterVolumeSpecName: "kube-api-access-mrlfp") pod "ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b" (UID: "ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b"). InnerVolumeSpecName "kube-api-access-mrlfp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.842833 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77705152-25fc-47d3-b448-00144a74f075-kube-api-access-5ph5v" (OuterVolumeSpecName: "kube-api-access-5ph5v") pod "77705152-25fc-47d3-b448-00144a74f075" (UID: "77705152-25fc-47d3-b448-00144a74f075"). InnerVolumeSpecName "kube-api-access-5ph5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.925673 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ph5v\" (UniqueName: \"kubernetes.io/projected/77705152-25fc-47d3-b448-00144a74f075-kube-api-access-5ph5v\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.925718 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77705152-25fc-47d3-b448-00144a74f075-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.925734 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:07 crc kubenswrapper[4546]: I0201 06:59:07.925745 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrlfp\" (UniqueName: \"kubernetes.io/projected/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b-kube-api-access-mrlfp\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.129578 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4kbrz" event={"ID":"77705152-25fc-47d3-b448-00144a74f075","Type":"ContainerDied","Data":"deb77350c34aa45594a72316d6649b6257cc4a89230e3c5320fa0ff847ecc58c"}
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.131621 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deb77350c34aa45594a72316d6649b6257cc4a89230e3c5320fa0ff847ecc58c"
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.129596 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4kbrz"
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.132546 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerStarted","Data":"dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a"}
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.134871 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wk829"
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.135983 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wk829" event={"ID":"ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b","Type":"ContainerDied","Data":"a3a507cc0da81b2ffac1e39d03f8f3a693713cb3f69608e0e3b8feac38a261c0"}
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.136038 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a507cc0da81b2ffac1e39d03f8f3a693713cb3f69608e0e3b8feac38a261c0"
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.575313 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cfac-account-create-update-lhf9t"
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.650107 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b60f534b-2c84-4054-99c1-c682e0a58c7f-operator-scripts\") pod \"b60f534b-2c84-4054-99c1-c682e0a58c7f\" (UID: \"b60f534b-2c84-4054-99c1-c682e0a58c7f\") "
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.650244 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkng7\" (UniqueName: \"kubernetes.io/projected/b60f534b-2c84-4054-99c1-c682e0a58c7f-kube-api-access-dkng7\") pod \"b60f534b-2c84-4054-99c1-c682e0a58c7f\" (UID: \"b60f534b-2c84-4054-99c1-c682e0a58c7f\") "
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.651057 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b60f534b-2c84-4054-99c1-c682e0a58c7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b60f534b-2c84-4054-99c1-c682e0a58c7f" (UID: "b60f534b-2c84-4054-99c1-c682e0a58c7f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.667988 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60f534b-2c84-4054-99c1-c682e0a58c7f-kube-api-access-dkng7" (OuterVolumeSpecName: "kube-api-access-dkng7") pod "b60f534b-2c84-4054-99c1-c682e0a58c7f" (UID: "b60f534b-2c84-4054-99c1-c682e0a58c7f"). InnerVolumeSpecName "kube-api-access-dkng7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.752970 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b60f534b-2c84-4054-99c1-c682e0a58c7f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.752994 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkng7\" (UniqueName: \"kubernetes.io/projected/b60f534b-2c84-4054-99c1-c682e0a58c7f-kube-api-access-dkng7\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.824177 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9d33-account-create-update-jz942"
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.828504 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8edb-account-create-update-dzs8g"
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.829291 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lvf28"
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.854151 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m2pb\" (UniqueName: \"kubernetes.io/projected/0e59900e-a73b-4d2c-be24-130f43e15f6d-kube-api-access-8m2pb\") pod \"0e59900e-a73b-4d2c-be24-130f43e15f6d\" (UID: \"0e59900e-a73b-4d2c-be24-130f43e15f6d\") "
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.854239 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab34a556-843f-4e9a-becd-82452d0ad83d-operator-scripts\") pod \"ab34a556-843f-4e9a-becd-82452d0ad83d\" (UID: \"ab34a556-843f-4e9a-becd-82452d0ad83d\") "
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.854260 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddzr7\" (UniqueName: \"kubernetes.io/projected/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-kube-api-access-ddzr7\") pod \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\" (UID: \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\") "
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.854462 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2q8k\" (UniqueName: \"kubernetes.io/projected/ab34a556-843f-4e9a-becd-82452d0ad83d-kube-api-access-j2q8k\") pod \"ab34a556-843f-4e9a-becd-82452d0ad83d\" (UID: \"ab34a556-843f-4e9a-becd-82452d0ad83d\") "
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.854605 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-operator-scripts\") pod \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\" (UID: \"b2ce9cfb-87da-40b6-9676-492ba3cce8b6\") "
Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.854662 4546 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e59900e-a73b-4d2c-be24-130f43e15f6d-operator-scripts\") pod \"0e59900e-a73b-4d2c-be24-130f43e15f6d\" (UID: \"0e59900e-a73b-4d2c-be24-130f43e15f6d\") " Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.855842 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e59900e-a73b-4d2c-be24-130f43e15f6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e59900e-a73b-4d2c-be24-130f43e15f6d" (UID: "0e59900e-a73b-4d2c-be24-130f43e15f6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.856281 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab34a556-843f-4e9a-becd-82452d0ad83d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab34a556-843f-4e9a-becd-82452d0ad83d" (UID: "ab34a556-843f-4e9a-becd-82452d0ad83d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.856313 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2ce9cfb-87da-40b6-9676-492ba3cce8b6" (UID: "b2ce9cfb-87da-40b6-9676-492ba3cce8b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.875344 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab34a556-843f-4e9a-becd-82452d0ad83d-kube-api-access-j2q8k" (OuterVolumeSpecName: "kube-api-access-j2q8k") pod "ab34a556-843f-4e9a-becd-82452d0ad83d" (UID: "ab34a556-843f-4e9a-becd-82452d0ad83d"). 
InnerVolumeSpecName "kube-api-access-j2q8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.879134 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e59900e-a73b-4d2c-be24-130f43e15f6d-kube-api-access-8m2pb" (OuterVolumeSpecName: "kube-api-access-8m2pb") pod "0e59900e-a73b-4d2c-be24-130f43e15f6d" (UID: "0e59900e-a73b-4d2c-be24-130f43e15f6d"). InnerVolumeSpecName "kube-api-access-8m2pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.882044 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-kube-api-access-ddzr7" (OuterVolumeSpecName: "kube-api-access-ddzr7") pod "b2ce9cfb-87da-40b6-9676-492ba3cce8b6" (UID: "b2ce9cfb-87da-40b6-9676-492ba3cce8b6"). InnerVolumeSpecName "kube-api-access-ddzr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.956489 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m2pb\" (UniqueName: \"kubernetes.io/projected/0e59900e-a73b-4d2c-be24-130f43e15f6d-kube-api-access-8m2pb\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.956529 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab34a556-843f-4e9a-becd-82452d0ad83d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.956541 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddzr7\" (UniqueName: \"kubernetes.io/projected/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-kube-api-access-ddzr7\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.956551 4546 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-j2q8k\" (UniqueName: \"kubernetes.io/projected/ab34a556-843f-4e9a-becd-82452d0ad83d-kube-api-access-j2q8k\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.956562 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2ce9cfb-87da-40b6-9676-492ba3cce8b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:08 crc kubenswrapper[4546]: I0201 06:59:08.956571 4546 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e59900e-a73b-4d2c-be24-130f43e15f6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.144324 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8edb-account-create-update-dzs8g" event={"ID":"0e59900e-a73b-4d2c-be24-130f43e15f6d","Type":"ContainerDied","Data":"546bb6203807d26c47f49c87b2f6f262a886319d9bdd1cfe28e3bf5432592c2b"} Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.144369 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8edb-account-create-update-dzs8g" Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.144386 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="546bb6203807d26c47f49c87b2f6f262a886319d9bdd1cfe28e3bf5432592c2b" Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.147105 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lvf28" Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.147103 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lvf28" event={"ID":"ab34a556-843f-4e9a-becd-82452d0ad83d","Type":"ContainerDied","Data":"53d07205bba756cefa74b66eb5b41c88eb9ad416b156b613ed807c60602b41a1"} Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.147620 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53d07205bba756cefa74b66eb5b41c88eb9ad416b156b613ed807c60602b41a1" Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.148739 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cfac-account-create-update-lhf9t" event={"ID":"b60f534b-2c84-4054-99c1-c682e0a58c7f","Type":"ContainerDied","Data":"6e584a35b476bb05a436a3304b90e7f844f17eeb4de6dbfd2daac6677584f32a"} Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.148786 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cfac-account-create-update-lhf9t" Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.148779 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e584a35b476bb05a436a3304b90e7f844f17eeb4de6dbfd2daac6677584f32a" Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.152505 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerStarted","Data":"4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c"} Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.153546 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9d33-account-create-update-jz942" event={"ID":"b2ce9cfb-87da-40b6-9676-492ba3cce8b6","Type":"ContainerDied","Data":"feca276b2917b02b32e256725390ab36df23ce2f5d31676f0bedd0503b6c08a7"} Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.153641 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feca276b2917b02b32e256725390ab36df23ce2f5d31676f0bedd0503b6c08a7" Feb 01 06:59:09 crc kubenswrapper[4546]: I0201 06:59:09.153775 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9d33-account-create-update-jz942" Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.784191 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.817957 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data-custom\") pod \"ea10db39-8540-4ff0-9a34-859b497605a9\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.818068 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data\") pod \"ea10db39-8540-4ff0-9a34-859b497605a9\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.818310 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-combined-ca-bundle\") pod \"ea10db39-8540-4ff0-9a34-859b497605a9\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.818595 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvj7t\" (UniqueName: \"kubernetes.io/projected/ea10db39-8540-4ff0-9a34-859b497605a9-kube-api-access-rvj7t\") pod \"ea10db39-8540-4ff0-9a34-859b497605a9\" (UID: \"ea10db39-8540-4ff0-9a34-859b497605a9\") " Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.838524 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea10db39-8540-4ff0-9a34-859b497605a9-kube-api-access-rvj7t" (OuterVolumeSpecName: "kube-api-access-rvj7t") pod "ea10db39-8540-4ff0-9a34-859b497605a9" (UID: "ea10db39-8540-4ff0-9a34-859b497605a9"). InnerVolumeSpecName "kube-api-access-rvj7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.843442 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ea10db39-8540-4ff0-9a34-859b497605a9" (UID: "ea10db39-8540-4ff0-9a34-859b497605a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.904002 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea10db39-8540-4ff0-9a34-859b497605a9" (UID: "ea10db39-8540-4ff0-9a34-859b497605a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.926991 4546 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.927020 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.927030 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvj7t\" (UniqueName: \"kubernetes.io/projected/ea10db39-8540-4ff0-9a34-859b497605a9-kube-api-access-rvj7t\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:10 crc kubenswrapper[4546]: I0201 06:59:10.966353 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data" 
(OuterVolumeSpecName: "config-data") pod "ea10db39-8540-4ff0-9a34-859b497605a9" (UID: "ea10db39-8540-4ff0-9a34-859b497605a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.031571 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea10db39-8540-4ff0-9a34-859b497605a9-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.175461 4546 generic.go:334] "Generic (PLEG): container finished" podID="ea10db39-8540-4ff0-9a34-859b497605a9" containerID="ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2" exitCode=0 Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.175523 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-66c6d5d4cd-sncfn" Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.175784 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-66c6d5d4cd-sncfn" event={"ID":"ea10db39-8540-4ff0-9a34-859b497605a9","Type":"ContainerDied","Data":"ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2"} Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.178023 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-66c6d5d4cd-sncfn" event={"ID":"ea10db39-8540-4ff0-9a34-859b497605a9","Type":"ContainerDied","Data":"25bf796d0a21ce13e558b8b2665e113fcc75bcb0448e08897fe02bc22429ae22"} Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.178066 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.178092 4546 scope.go:117] "RemoveContainer" containerID="ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2" Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.177909 4546 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="sg-core" containerID="cri-o://4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c" gracePeriod=30 Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.177938 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="ceilometer-notification-agent" containerID="cri-o://dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a" gracePeriod=30 Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.177850 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="ceilometer-central-agent" containerID="cri-o://f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41" gracePeriod=30 Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.178103 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerStarted","Data":"7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25"} Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.177898 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="proxy-httpd" containerID="cri-o://7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25" gracePeriod=30 Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.219370 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.235294065 podStartE2EDuration="8.219349913s" podCreationTimestamp="2026-02-01 06:59:03 +0000 UTC" firstStartedPulling="2026-02-01 06:59:05.643110053 +0000 UTC m=+976.294046069" lastFinishedPulling="2026-02-01 
06:59:10.6271659 +0000 UTC m=+981.278101917" observedRunningTime="2026-02-01 06:59:11.201347777 +0000 UTC m=+981.852283793" watchObservedRunningTime="2026-02-01 06:59:11.219349913 +0000 UTC m=+981.870285929" Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.245462 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-66c6d5d4cd-sncfn"] Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.250066 4546 scope.go:117] "RemoveContainer" containerID="ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2" Feb 01 06:59:11 crc kubenswrapper[4546]: E0201 06:59:11.250589 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2\": container with ID starting with ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2 not found: ID does not exist" containerID="ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2" Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.250696 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2"} err="failed to get container status \"ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2\": rpc error: code = NotFound desc = could not find container \"ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2\": container with ID starting with ba8ade31aff01f80d6545a0abc862e694f33669020542c63bebe4281f7268ce2 not found: ID does not exist" Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.253240 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-66c6d5d4cd-sncfn"] Feb 01 06:59:11 crc kubenswrapper[4546]: I0201 06:59:11.666185 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea10db39-8540-4ff0-9a34-859b497605a9" 
path="/var/lib/kubelet/pods/ea10db39-8540-4ff0-9a34-859b497605a9/volumes" Feb 01 06:59:12 crc kubenswrapper[4546]: I0201 06:59:12.194156 4546 generic.go:334] "Generic (PLEG): container finished" podID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerID="4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c" exitCode=2 Feb 01 06:59:12 crc kubenswrapper[4546]: I0201 06:59:12.194198 4546 generic.go:334] "Generic (PLEG): container finished" podID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerID="dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a" exitCode=0 Feb 01 06:59:12 crc kubenswrapper[4546]: I0201 06:59:12.194224 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerDied","Data":"4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c"} Feb 01 06:59:12 crc kubenswrapper[4546]: I0201 06:59:12.194258 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerDied","Data":"dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a"} Feb 01 06:59:13 crc kubenswrapper[4546]: I0201 06:59:13.209975 4546 generic.go:334] "Generic (PLEG): container finished" podID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerID="f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41" exitCode=0 Feb 01 06:59:13 crc kubenswrapper[4546]: I0201 06:59:13.209982 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerDied","Data":"f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41"} Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.799781 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fvmdz"] Feb 01 06:59:14 crc kubenswrapper[4546]: E0201 06:59:14.800593 4546 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ea10db39-8540-4ff0-9a34-859b497605a9" containerName="heat-engine" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800607 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea10db39-8540-4ff0-9a34-859b497605a9" containerName="heat-engine" Feb 01 06:59:14 crc kubenswrapper[4546]: E0201 06:59:14.800617 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab34a556-843f-4e9a-becd-82452d0ad83d" containerName="mariadb-database-create" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800623 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab34a556-843f-4e9a-becd-82452d0ad83d" containerName="mariadb-database-create" Feb 01 06:59:14 crc kubenswrapper[4546]: E0201 06:59:14.800633 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b" containerName="mariadb-database-create" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800638 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b" containerName="mariadb-database-create" Feb 01 06:59:14 crc kubenswrapper[4546]: E0201 06:59:14.800654 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60f534b-2c84-4054-99c1-c682e0a58c7f" containerName="mariadb-account-create-update" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800659 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60f534b-2c84-4054-99c1-c682e0a58c7f" containerName="mariadb-account-create-update" Feb 01 06:59:14 crc kubenswrapper[4546]: E0201 06:59:14.800669 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e59900e-a73b-4d2c-be24-130f43e15f6d" containerName="mariadb-account-create-update" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800684 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e59900e-a73b-4d2c-be24-130f43e15f6d" containerName="mariadb-account-create-update" Feb 01 06:59:14 crc 
kubenswrapper[4546]: E0201 06:59:14.800697 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ce9cfb-87da-40b6-9676-492ba3cce8b6" containerName="mariadb-account-create-update" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800702 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ce9cfb-87da-40b6-9676-492ba3cce8b6" containerName="mariadb-account-create-update" Feb 01 06:59:14 crc kubenswrapper[4546]: E0201 06:59:14.800713 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77705152-25fc-47d3-b448-00144a74f075" containerName="mariadb-database-create" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800718 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="77705152-25fc-47d3-b448-00144a74f075" containerName="mariadb-database-create" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800902 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea10db39-8540-4ff0-9a34-859b497605a9" containerName="heat-engine" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800928 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b" containerName="mariadb-database-create" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800938 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="77705152-25fc-47d3-b448-00144a74f075" containerName="mariadb-database-create" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800946 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e59900e-a73b-4d2c-be24-130f43e15f6d" containerName="mariadb-account-create-update" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800957 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab34a556-843f-4e9a-becd-82452d0ad83d" containerName="mariadb-database-create" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800965 4546 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b2ce9cfb-87da-40b6-9676-492ba3cce8b6" containerName="mariadb-account-create-update" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.800972 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60f534b-2c84-4054-99c1-c682e0a58c7f" containerName="mariadb-account-create-update" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.801507 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.805558 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.805797 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q7jdh" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.805964 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.807047 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-config-data\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.807188 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.807217 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-scripts\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.807328 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4q4f\" (UniqueName: \"kubernetes.io/projected/69960492-73f4-4adf-94c6-f8f6ea237503-kube-api-access-j4q4f\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.828000 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fvmdz"] Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.910158 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4q4f\" (UniqueName: \"kubernetes.io/projected/69960492-73f4-4adf-94c6-f8f6ea237503-kube-api-access-j4q4f\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.910248 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-config-data\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.910338 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: 
\"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.910357 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-scripts\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.921364 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.927282 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-scripts\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.927449 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-config-data\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:14 crc kubenswrapper[4546]: I0201 06:59:14.930332 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4q4f\" (UniqueName: \"kubernetes.io/projected/69960492-73f4-4adf-94c6-f8f6ea237503-kube-api-access-j4q4f\") pod \"nova-cell0-conductor-db-sync-fvmdz\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") " 
pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:15 crc kubenswrapper[4546]: I0201 06:59:15.116422 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:15 crc kubenswrapper[4546]: I0201 06:59:15.713916 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fvmdz"] Feb 01 06:59:16 crc kubenswrapper[4546]: I0201 06:59:16.248279 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fvmdz" event={"ID":"69960492-73f4-4adf-94c6-f8f6ea237503","Type":"ContainerStarted","Data":"9f2186f21536b83c1a91f898a7f0e103e898c1e250afb99efafba61eaa8389f2"} Feb 01 06:59:21 crc kubenswrapper[4546]: I0201 06:59:21.333431 4546 generic.go:334] "Generic (PLEG): container finished" podID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerID="1defa9cde24bd7fa205a1d242ff1e49e8126f438e7db916597ed746325e80d73" exitCode=137 Feb 01 06:59:21 crc kubenswrapper[4546]: I0201 06:59:21.333805 4546 generic.go:334] "Generic (PLEG): container finished" podID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerID="1dc96d1f38420507484550dc5fea604ec7287d0ef1855005f7600d236c468b7c" exitCode=137 Feb 01 06:59:21 crc kubenswrapper[4546]: I0201 06:59:21.333520 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8bd8cd6b-vfjlr" event={"ID":"42765622-7cd6-4ad8-9917-35e6fccc928d","Type":"ContainerDied","Data":"1defa9cde24bd7fa205a1d242ff1e49e8126f438e7db916597ed746325e80d73"} Feb 01 06:59:21 crc kubenswrapper[4546]: I0201 06:59:21.333891 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8bd8cd6b-vfjlr" event={"ID":"42765622-7cd6-4ad8-9917-35e6fccc928d","Type":"ContainerDied","Data":"1dc96d1f38420507484550dc5fea604ec7287d0ef1855005f7600d236c468b7c"} Feb 01 06:59:21 crc kubenswrapper[4546]: I0201 06:59:21.333923 4546 scope.go:117] "RemoveContainer" 
containerID="7d86ac28320dfdeffcd7f6de1c9aec106f75400f1752f6450b264050c4e7d9ce" Feb 01 06:59:21 crc kubenswrapper[4546]: I0201 06:59:21.497394 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 06:59:21 crc kubenswrapper[4546]: I0201 06:59:21.497652 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerName="glance-log" containerID="cri-o://1d756fb963091c578fde353b62a409395523aa03978879c60b95208a39b88ca7" gracePeriod=30 Feb 01 06:59:21 crc kubenswrapper[4546]: I0201 06:59:21.497763 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerName="glance-httpd" containerID="cri-o://0f49c519ff4bdf72daa941545c8c3591d9ee81d4959f9b934c257411417daebc" gracePeriod=30 Feb 01 06:59:22 crc kubenswrapper[4546]: I0201 06:59:22.279884 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 06:59:22 crc kubenswrapper[4546]: I0201 06:59:22.280393 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerName="glance-log" containerID="cri-o://4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38" gracePeriod=30 Feb 01 06:59:22 crc kubenswrapper[4546]: I0201 06:59:22.280515 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerName="glance-httpd" containerID="cri-o://3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d" gracePeriod=30 Feb 01 06:59:22 crc kubenswrapper[4546]: I0201 06:59:22.346436 4546 generic.go:334] "Generic (PLEG): container finished" 
podID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerID="1d756fb963091c578fde353b62a409395523aa03978879c60b95208a39b88ca7" exitCode=143 Feb 01 06:59:22 crc kubenswrapper[4546]: I0201 06:59:22.346484 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8467d399-3ecc-4cb7-83ab-d285f8cdf7de","Type":"ContainerDied","Data":"1d756fb963091c578fde353b62a409395523aa03978879c60b95208a39b88ca7"} Feb 01 06:59:23 crc kubenswrapper[4546]: I0201 06:59:23.360423 4546 generic.go:334] "Generic (PLEG): container finished" podID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerID="4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38" exitCode=143 Feb 01 06:59:23 crc kubenswrapper[4546]: I0201 06:59:23.360928 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff7f7e42-647e-4e25-a3a8-32c23eeb9277","Type":"ContainerDied","Data":"4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38"} Feb 01 06:59:25 crc kubenswrapper[4546]: I0201 06:59:25.378718 4546 generic.go:334] "Generic (PLEG): container finished" podID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerID="0f49c519ff4bdf72daa941545c8c3591d9ee81d4959f9b934c257411417daebc" exitCode=0 Feb 01 06:59:25 crc kubenswrapper[4546]: I0201 06:59:25.378918 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8467d399-3ecc-4cb7-83ab-d285f8cdf7de","Type":"ContainerDied","Data":"0f49c519ff4bdf72daa941545c8c3591d9ee81d4959f9b934c257411417daebc"} Feb 01 06:59:25 crc kubenswrapper[4546]: I0201 06:59:25.931440 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.006894 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.092586 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54d5j\" (UniqueName: \"kubernetes.io/projected/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-kube-api-access-54d5j\") pod \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.092850 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-public-tls-certs\") pod \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.092904 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-secret-key\") pod \"42765622-7cd6-4ad8-9917-35e6fccc928d\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.092937 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-combined-ca-bundle\") pod \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.092959 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-httpd-run\") pod \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.093012 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-scripts\") pod \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.093028 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-logs\") pod \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.093048 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42765622-7cd6-4ad8-9917-35e6fccc928d-logs\") pod \"42765622-7cd6-4ad8-9917-35e6fccc928d\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.093081 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-config-data\") pod \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.093106 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-combined-ca-bundle\") pod \"42765622-7cd6-4ad8-9917-35e6fccc928d\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.093134 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9j8q\" (UniqueName: \"kubernetes.io/projected/42765622-7cd6-4ad8-9917-35e6fccc928d-kube-api-access-s9j8q\") pod \"42765622-7cd6-4ad8-9917-35e6fccc928d\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 
06:59:26.093183 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-tls-certs\") pod \"42765622-7cd6-4ad8-9917-35e6fccc928d\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.093220 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-scripts\") pod \"42765622-7cd6-4ad8-9917-35e6fccc928d\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.093305 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\" (UID: \"8467d399-3ecc-4cb7-83ab-d285f8cdf7de\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.093335 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-config-data\") pod \"42765622-7cd6-4ad8-9917-35e6fccc928d\" (UID: \"42765622-7cd6-4ad8-9917-35e6fccc928d\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.095649 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-logs" (OuterVolumeSpecName: "logs") pod "8467d399-3ecc-4cb7-83ab-d285f8cdf7de" (UID: "8467d399-3ecc-4cb7-83ab-d285f8cdf7de"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.098174 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8467d399-3ecc-4cb7-83ab-d285f8cdf7de" (UID: "8467d399-3ecc-4cb7-83ab-d285f8cdf7de"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.100386 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-kube-api-access-54d5j" (OuterVolumeSpecName: "kube-api-access-54d5j") pod "8467d399-3ecc-4cb7-83ab-d285f8cdf7de" (UID: "8467d399-3ecc-4cb7-83ab-d285f8cdf7de"). InnerVolumeSpecName "kube-api-access-54d5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.102328 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42765622-7cd6-4ad8-9917-35e6fccc928d-logs" (OuterVolumeSpecName: "logs") pod "42765622-7cd6-4ad8-9917-35e6fccc928d" (UID: "42765622-7cd6-4ad8-9917-35e6fccc928d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.112040 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-scripts" (OuterVolumeSpecName: "scripts") pod "8467d399-3ecc-4cb7-83ab-d285f8cdf7de" (UID: "8467d399-3ecc-4cb7-83ab-d285f8cdf7de"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.112229 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42765622-7cd6-4ad8-9917-35e6fccc928d-kube-api-access-s9j8q" (OuterVolumeSpecName: "kube-api-access-s9j8q") pod "42765622-7cd6-4ad8-9917-35e6fccc928d" (UID: "42765622-7cd6-4ad8-9917-35e6fccc928d"). InnerVolumeSpecName "kube-api-access-s9j8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.112445 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "42765622-7cd6-4ad8-9917-35e6fccc928d" (UID: "42765622-7cd6-4ad8-9917-35e6fccc928d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.130034 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "8467d399-3ecc-4cb7-83ab-d285f8cdf7de" (UID: "8467d399-3ecc-4cb7-83ab-d285f8cdf7de"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.164189 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8467d399-3ecc-4cb7-83ab-d285f8cdf7de" (UID: "8467d399-3ecc-4cb7-83ab-d285f8cdf7de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.164945 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-config-data" (OuterVolumeSpecName: "config-data") pod "42765622-7cd6-4ad8-9917-35e6fccc928d" (UID: "42765622-7cd6-4ad8-9917-35e6fccc928d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.174500 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42765622-7cd6-4ad8-9917-35e6fccc928d" (UID: "42765622-7cd6-4ad8-9917-35e6fccc928d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.180803 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8467d399-3ecc-4cb7-83ab-d285f8cdf7de" (UID: "8467d399-3ecc-4cb7-83ab-d285f8cdf7de"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.181186 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-scripts" (OuterVolumeSpecName: "scripts") pod "42765622-7cd6-4ad8-9917-35e6fccc928d" (UID: "42765622-7cd6-4ad8-9917-35e6fccc928d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196179 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196235 4546 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196249 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42765622-7cd6-4ad8-9917-35e6fccc928d-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196259 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54d5j\" (UniqueName: \"kubernetes.io/projected/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-kube-api-access-54d5j\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196272 4546 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196280 4546 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196289 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196298 4546 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196306 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196314 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196322 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42765622-7cd6-4ad8-9917-35e6fccc928d-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196329 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.196339 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9j8q\" (UniqueName: \"kubernetes.io/projected/42765622-7cd6-4ad8-9917-35e6fccc928d-kube-api-access-s9j8q\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.214761 4546 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.221223 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod 
"42765622-7cd6-4ad8-9917-35e6fccc928d" (UID: "42765622-7cd6-4ad8-9917-35e6fccc928d"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.248820 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-config-data" (OuterVolumeSpecName: "config-data") pod "8467d399-3ecc-4cb7-83ab-d285f8cdf7de" (UID: "8467d399-3ecc-4cb7-83ab-d285f8cdf7de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.299130 4546 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/42765622-7cd6-4ad8-9917-35e6fccc928d-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.299165 4546 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.299178 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8467d399-3ecc-4cb7-83ab-d285f8cdf7de-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.316817 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.390636 4546 generic.go:334] "Generic (PLEG): container finished" podID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerID="3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d" exitCode=0 Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.390673 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.390755 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff7f7e42-647e-4e25-a3a8-32c23eeb9277","Type":"ContainerDied","Data":"3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d"} Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.390838 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff7f7e42-647e-4e25-a3a8-32c23eeb9277","Type":"ContainerDied","Data":"46156c13f831e3b09731f5fa785c1b941464337a17d4ac2efff54d4466769ee8"} Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.390940 4546 scope.go:117] "RemoveContainer" containerID="3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.394810 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.395037 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8467d399-3ecc-4cb7-83ab-d285f8cdf7de","Type":"ContainerDied","Data":"5e8a0e938938b72764834474a3c17268e5f6b604915fac9dd0578d698071ff70"} Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.399971 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-internal-tls-certs\") pod \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.400128 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-scripts\") pod 
\"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.400237 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5852\" (UniqueName: \"kubernetes.io/projected/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-kube-api-access-f5852\") pod \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.400302 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.400437 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-config-data\") pod \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.400629 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-logs\") pod \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.400731 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-httpd-run\") pod \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.400824 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-combined-ca-bundle\") pod \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\" (UID: \"ff7f7e42-647e-4e25-a3a8-32c23eeb9277\") " Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.402491 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-logs" (OuterVolumeSpecName: "logs") pod "ff7f7e42-647e-4e25-a3a8-32c23eeb9277" (UID: "ff7f7e42-647e-4e25-a3a8-32c23eeb9277"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.402750 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff7f7e42-647e-4e25-a3a8-32c23eeb9277" (UID: "ff7f7e42-647e-4e25-a3a8-32c23eeb9277"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.403771 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-scripts" (OuterVolumeSpecName: "scripts") pod "ff7f7e42-647e-4e25-a3a8-32c23eeb9277" (UID: "ff7f7e42-647e-4e25-a3a8-32c23eeb9277"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.406281 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c8bd8cd6b-vfjlr" event={"ID":"42765622-7cd6-4ad8-9917-35e6fccc928d","Type":"ContainerDied","Data":"f632170be0c9e01aec485bb30ecfa4234b4d9ca9d3f5f2c42f7a045df77cc580"} Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.406362 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c8bd8cd6b-vfjlr" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.409357 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-kube-api-access-f5852" (OuterVolumeSpecName: "kube-api-access-f5852") pod "ff7f7e42-647e-4e25-a3a8-32c23eeb9277" (UID: "ff7f7e42-647e-4e25-a3a8-32c23eeb9277"). InnerVolumeSpecName "kube-api-access-f5852". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.409520 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "ff7f7e42-647e-4e25-a3a8-32c23eeb9277" (UID: "ff7f7e42-647e-4e25-a3a8-32c23eeb9277"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.410572 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.415546 4546 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.418060 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.422762 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5852\" (UniqueName: \"kubernetes.io/projected/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-kube-api-access-f5852\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:26 
crc kubenswrapper[4546]: I0201 06:59:26.423075 4546 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.421579 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fvmdz" event={"ID":"69960492-73f4-4adf-94c6-f8f6ea237503","Type":"ContainerStarted","Data":"832c4e5e70e268faf40ffc5ca3cca6f3a014fbe7ee29f7231121bb26bb237701"}
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.448523 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.453119 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff7f7e42-647e-4e25-a3a8-32c23eeb9277" (UID: "ff7f7e42-647e-4e25-a3a8-32c23eeb9277"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.456065 4546 scope.go:117] "RemoveContainer" containerID="4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.484007 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.502747 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.502899 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff7f7e42-647e-4e25-a3a8-32c23eeb9277" (UID: "ff7f7e42-647e-4e25-a3a8-32c23eeb9277"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:59:26 crc kubenswrapper[4546]: E0201 06:59:26.503369 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503388 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon"
Feb 01 06:59:26 crc kubenswrapper[4546]: E0201 06:59:26.503402 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503425 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon"
Feb 01 06:59:26 crc kubenswrapper[4546]: E0201 06:59:26.503439 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon-log"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503444 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon-log"
Feb 01 06:59:26 crc kubenswrapper[4546]: E0201 06:59:26.503465 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerName="glance-log"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503470 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerName="glance-log"
Feb 01 06:59:26 crc kubenswrapper[4546]: E0201 06:59:26.503487 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerName="glance-log"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503493 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerName="glance-log"
Feb 01 06:59:26 crc kubenswrapper[4546]: E0201 06:59:26.503503 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerName="glance-httpd"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503510 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerName="glance-httpd"
Feb 01 06:59:26 crc kubenswrapper[4546]: E0201 06:59:26.503522 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerName="glance-httpd"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503528 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerName="glance-httpd"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503749 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503760 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerName="glance-log"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503770 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerName="glance-log"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503781 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" containerName="glance-httpd"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503792 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon-log"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.503801 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" containerName="glance-httpd"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.504218 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" containerName="horizon"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.504850 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.508403 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.510347 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.512177 4546 scope.go:117] "RemoveContainer" containerID="3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d"
Feb 01 06:59:26 crc kubenswrapper[4546]: E0201 06:59:26.514219 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d\": container with ID starting with 3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d not found: ID does not exist" containerID="3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.514265 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d"} err="failed to get container status \"3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d\": rpc error: code = NotFound desc = could not find container \"3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d\": container with ID starting with 3277153268bebbc0baf908b252f9cfabd4b390bfdaa72d4e68630f5993b65a9d not found: ID does not exist"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.514290 4546 scope.go:117] "RemoveContainer" containerID="4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38"
Feb 01 06:59:26 crc kubenswrapper[4546]: E0201 06:59:26.518017 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38\": container with ID starting with 4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38 not found: ID does not exist" containerID="4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.518061 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38"} err="failed to get container status \"4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38\": rpc error: code = NotFound desc = could not find container \"4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38\": container with ID starting with 4d51fb8a29af7596d00de88192682337f1effaad8ae706318501e12585adde38 not found: ID does not exist"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.518091 4546 scope.go:117] "RemoveContainer" containerID="0f49c519ff4bdf72daa941545c8c3591d9ee81d4959f9b934c257411417daebc"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.526899 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2gg\" (UniqueName: \"kubernetes.io/projected/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-kube-api-access-gh2gg\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.526999 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.527019 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.527138 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.527264 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.527415 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.527518 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-logs\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.527544 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.527607 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.527617 4546 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.529335 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-config-data" (OuterVolumeSpecName: "config-data") pod "ff7f7e42-647e-4e25-a3a8-32c23eeb9277" (UID: "ff7f7e42-647e-4e25-a3a8-32c23eeb9277"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.530261 4546 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.537965 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c8bd8cd6b-vfjlr"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.543890 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c8bd8cd6b-vfjlr"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.547758 4546 scope.go:117] "RemoveContainer" containerID="1d756fb963091c578fde353b62a409395523aa03978879c60b95208a39b88ca7"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.551055 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.552416 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-fvmdz" podStartSLOduration=2.3461483530000002 podStartE2EDuration="12.552407209s" podCreationTimestamp="2026-02-01 06:59:14 +0000 UTC" firstStartedPulling="2026-02-01 06:59:15.729145977 +0000 UTC m=+986.380081993" lastFinishedPulling="2026-02-01 06:59:25.935404842 +0000 UTC m=+996.586340849" observedRunningTime="2026-02-01 06:59:26.491145835 +0000 UTC m=+997.142081851" watchObservedRunningTime="2026-02-01 06:59:26.552407209 +0000 UTC m=+997.203343214"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.564278 4546 scope.go:117] "RemoveContainer" containerID="1defa9cde24bd7fa205a1d242ff1e49e8126f438e7db916597ed746325e80d73"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631329 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631419 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-logs\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631454 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631486 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2gg\" (UniqueName: \"kubernetes.io/projected/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-kube-api-access-gh2gg\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631520 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631538 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631560 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631584 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631633 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631717 4546 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.631749 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7f7e42-647e-4e25-a3a8-32c23eeb9277-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.632593 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-logs\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.632873 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.635191 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.637030 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.637769 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.639421 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.647537 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2gg\" (UniqueName: \"kubernetes.io/projected/aa3b7e16-fa4a-411a-a8cc-64dd573b71af-kube-api-access-gh2gg\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.679300 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"aa3b7e16-fa4a-411a-a8cc-64dd573b71af\") " pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.739711 4546 scope.go:117] "RemoveContainer" containerID="1dc96d1f38420507484550dc5fea604ec7287d0ef1855005f7600d236c468b7c"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.749394 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.761573 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.771995 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.773707 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.778069 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.778249 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.800124 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.826455 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.937798 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ad007a5-ef1e-4768-8370-ac8473a042ff-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.938034 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.938070 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.938107 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb2wf\" (UniqueName: \"kubernetes.io/projected/5ad007a5-ef1e-4768-8370-ac8473a042ff-kube-api-access-kb2wf\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.938132 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.938214 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.938234 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:26 crc kubenswrapper[4546]: I0201 06:59:26.938251 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ad007a5-ef1e-4768-8370-ac8473a042ff-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.040390 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ad007a5-ef1e-4768-8370-ac8473a042ff-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.040441 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.040520 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.040581 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb2wf\" (UniqueName: \"kubernetes.io/projected/5ad007a5-ef1e-4768-8370-ac8473a042ff-kube-api-access-kb2wf\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.040623 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.040708 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.040736 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.040759 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ad007a5-ef1e-4768-8370-ac8473a042ff-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.041317 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ad007a5-ef1e-4768-8370-ac8473a042ff-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.041742 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ad007a5-ef1e-4768-8370-ac8473a042ff-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.043478 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.047180 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.047610 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.048256 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.051423 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ad007a5-ef1e-4768-8370-ac8473a042ff-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.064794 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb2wf\" (UniqueName: \"kubernetes.io/projected/5ad007a5-ef1e-4768-8370-ac8473a042ff-kube-api-access-kb2wf\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.101225 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"5ad007a5-ef1e-4768-8370-ac8473a042ff\") " pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.394601 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.581566 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.677085 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42765622-7cd6-4ad8-9917-35e6fccc928d" path="/var/lib/kubelet/pods/42765622-7cd6-4ad8-9917-35e6fccc928d/volumes"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.683677 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8467d399-3ecc-4cb7-83ab-d285f8cdf7de" path="/var/lib/kubelet/pods/8467d399-3ecc-4cb7-83ab-d285f8cdf7de/volumes"
Feb 01 06:59:27 crc kubenswrapper[4546]: I0201 06:59:27.684408 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff7f7e42-647e-4e25-a3a8-32c23eeb9277" path="/var/lib/kubelet/pods/ff7f7e42-647e-4e25-a3a8-32c23eeb9277/volumes"
Feb 01 06:59:28 crc kubenswrapper[4546]: I0201 06:59:28.009033 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 01 06:59:28 crc kubenswrapper[4546]: I0201 06:59:28.444926 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ad007a5-ef1e-4768-8370-ac8473a042ff","Type":"ContainerStarted","Data":"e1b93b3a93d35099dcdf682fd719c3835519978c91277ba1adf70be6bd8aef58"}
Feb 01 06:59:28 crc kubenswrapper[4546]: I0201 06:59:28.450686 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa3b7e16-fa4a-411a-a8cc-64dd573b71af","Type":"ContainerStarted","Data":"aa8a4537fb9646d6d715dfd6bce9ee67c0c86e54a18e1d6fa9dc22e66a38a7c7"}
Feb 01 06:59:28 crc kubenswrapper[4546]: I0201 06:59:28.450742 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa3b7e16-fa4a-411a-a8cc-64dd573b71af","Type":"ContainerStarted","Data":"3304fe584fe163753d4efefd4382613e93d5a6c807207582d46ee1149636b2a6"}
Feb 01 06:59:29 crc kubenswrapper[4546]: I0201 06:59:29.463727 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ad007a5-ef1e-4768-8370-ac8473a042ff","Type":"ContainerStarted","Data":"9010c408d9203e7c1a6d6fd653c537113863e088061262baf19b875b39b53f39"}
Feb 01 06:59:29 crc kubenswrapper[4546]: I0201 06:59:29.464140 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ad007a5-ef1e-4768-8370-ac8473a042ff","Type":"ContainerStarted","Data":"1ad733ec6c07e55e5daf947be5844012cb11b1f318d89311301c2d83505def13"}
Feb 01 06:59:29 crc kubenswrapper[4546]: I0201 06:59:29.466297 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"aa3b7e16-fa4a-411a-a8cc-64dd573b71af","Type":"ContainerStarted","Data":"9770221c8a41d0cff67d3e47818bc3a09cc270b2d7a0fedb2c49bef8edd0eae9"}
Feb 01 06:59:29 crc kubenswrapper[4546]: I0201 06:59:29.493103 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.493089478 podStartE2EDuration="3.493089478s" podCreationTimestamp="2026-02-01 06:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:59:29.48530832 +0000 UTC m=+1000.136244335" watchObservedRunningTime="2026-02-01 06:59:29.493089478 +0000 UTC m=+1000.144025494"
Feb 01 06:59:29 crc kubenswrapper[4546]: I0201 06:59:29.518548 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.518528152 podStartE2EDuration="3.518528152s" podCreationTimestamp="2026-02-01 06:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:59:29.513601324 +0000 UTC m=+1000.164537341" watchObservedRunningTime="2026-02-01 06:59:29.518528152 +0000 UTC m=+1000.169464168"
Feb 01 06:59:34 crc kubenswrapper[4546]: I0201 06:59:34.481929 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 01 06:59:34 crc kubenswrapper[4546]: I0201 06:59:34.524739 4546 generic.go:334] "Generic (PLEG): container finished" podID="69960492-73f4-4adf-94c6-f8f6ea237503" containerID="832c4e5e70e268faf40ffc5ca3cca6f3a014fbe7ee29f7231121bb26bb237701" exitCode=0
Feb 01 06:59:34 crc kubenswrapper[4546]: I0201 06:59:34.524801 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fvmdz" event={"ID":"69960492-73f4-4adf-94c6-f8f6ea237503","Type":"ContainerDied","Data":"832c4e5e70e268faf40ffc5ca3cca6f3a014fbe7ee29f7231121bb26bb237701"}
Feb 01 06:59:35 crc kubenswrapper[4546]: I0201 06:59:35.871735 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fvmdz"
Feb 01 06:59:35 crc kubenswrapper[4546]: I0201 06:59:35.975137 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-combined-ca-bundle\") pod \"69960492-73f4-4adf-94c6-f8f6ea237503\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") "
Feb 01 06:59:35 crc kubenswrapper[4546]: I0201 06:59:35.975296 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-config-data\") pod \"69960492-73f4-4adf-94c6-f8f6ea237503\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") "
Feb 01 06:59:35 crc kubenswrapper[4546]: I0201 06:59:35.975397 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4q4f\" (UniqueName: \"kubernetes.io/projected/69960492-73f4-4adf-94c6-f8f6ea237503-kube-api-access-j4q4f\") pod \"69960492-73f4-4adf-94c6-f8f6ea237503\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") "
Feb 01 06:59:35 crc kubenswrapper[4546]: I0201 06:59:35.975470 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-scripts\") pod \"69960492-73f4-4adf-94c6-f8f6ea237503\" (UID: \"69960492-73f4-4adf-94c6-f8f6ea237503\") "
Feb 01 06:59:35 crc kubenswrapper[4546]: I0201 06:59:35.982815 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69960492-73f4-4adf-94c6-f8f6ea237503-kube-api-access-j4q4f" (OuterVolumeSpecName: "kube-api-access-j4q4f") pod "69960492-73f4-4adf-94c6-f8f6ea237503" (UID: "69960492-73f4-4adf-94c6-f8f6ea237503"). InnerVolumeSpecName "kube-api-access-j4q4f".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:35 crc kubenswrapper[4546]: I0201 06:59:35.984187 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-scripts" (OuterVolumeSpecName: "scripts") pod "69960492-73f4-4adf-94c6-f8f6ea237503" (UID: "69960492-73f4-4adf-94c6-f8f6ea237503"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.004212 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69960492-73f4-4adf-94c6-f8f6ea237503" (UID: "69960492-73f4-4adf-94c6-f8f6ea237503"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.005023 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-config-data" (OuterVolumeSpecName: "config-data") pod "69960492-73f4-4adf-94c6-f8f6ea237503" (UID: "69960492-73f4-4adf-94c6-f8f6ea237503"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.077873 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.078205 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.078263 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4q4f\" (UniqueName: \"kubernetes.io/projected/69960492-73f4-4adf-94c6-f8f6ea237503-kube-api-access-j4q4f\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.078320 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69960492-73f4-4adf-94c6-f8f6ea237503-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.548372 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fvmdz" event={"ID":"69960492-73f4-4adf-94c6-f8f6ea237503","Type":"ContainerDied","Data":"9f2186f21536b83c1a91f898a7f0e103e898c1e250afb99efafba61eaa8389f2"} Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.548433 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f2186f21536b83c1a91f898a7f0e103e898c1e250afb99efafba61eaa8389f2" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.548558 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fvmdz" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.675848 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 06:59:36 crc kubenswrapper[4546]: E0201 06:59:36.676379 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69960492-73f4-4adf-94c6-f8f6ea237503" containerName="nova-cell0-conductor-db-sync" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.676401 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="69960492-73f4-4adf-94c6-f8f6ea237503" containerName="nova-cell0-conductor-db-sync" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.676666 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="69960492-73f4-4adf-94c6-f8f6ea237503" containerName="nova-cell0-conductor-db-sync" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.678371 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.681181 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.681730 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q7jdh" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.696020 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.792456 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 
06:59:36.792493 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.792723 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2xb2\" (UniqueName: \"kubernetes.io/projected/eff99989-51ad-47e8-99e0-0d7adc1ad550-kube-api-access-t2xb2\") pod \"nova-cell0-conductor-0\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.827235 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.827398 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.857501 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.864507 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.894698 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2xb2\" (UniqueName: \"kubernetes.io/projected/eff99989-51ad-47e8-99e0-0d7adc1ad550-kube-api-access-t2xb2\") pod \"nova-cell0-conductor-0\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.894890 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.894915 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.902934 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.903574 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.913466 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2xb2\" (UniqueName: \"kubernetes.io/projected/eff99989-51ad-47e8-99e0-0d7adc1ad550-kube-api-access-t2xb2\") pod \"nova-cell0-conductor-0\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:36 crc kubenswrapper[4546]: I0201 06:59:36.997308 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:37 crc kubenswrapper[4546]: I0201 06:59:37.395138 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:37 crc kubenswrapper[4546]: I0201 06:59:37.395473 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:37 crc kubenswrapper[4546]: I0201 06:59:37.428637 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:37 crc kubenswrapper[4546]: I0201 06:59:37.432529 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:37 crc kubenswrapper[4546]: I0201 06:59:37.559540 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:37 crc kubenswrapper[4546]: I0201 06:59:37.559602 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 01 06:59:37 crc kubenswrapper[4546]: I0201 06:59:37.559617 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:37 crc kubenswrapper[4546]: I0201 06:59:37.559628 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 01 06:59:37 crc kubenswrapper[4546]: I0201 06:59:37.627906 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 06:59:38 crc kubenswrapper[4546]: I0201 06:59:38.569804 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eff99989-51ad-47e8-99e0-0d7adc1ad550","Type":"ContainerStarted","Data":"90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76"} Feb 01 06:59:38 crc 
kubenswrapper[4546]: I0201 06:59:38.570111 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eff99989-51ad-47e8-99e0-0d7adc1ad550","Type":"ContainerStarted","Data":"38211dc5f0aa1edd97ca38c7e656807b9a4894633c8e5fbd3850e414ff13e15e"} Feb 01 06:59:38 crc kubenswrapper[4546]: I0201 06:59:38.571156 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:39 crc kubenswrapper[4546]: I0201 06:59:39.095118 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.095102814 podStartE2EDuration="3.095102814s" podCreationTimestamp="2026-02-01 06:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:59:38.595161198 +0000 UTC m=+1009.246097214" watchObservedRunningTime="2026-02-01 06:59:39.095102814 +0000 UTC m=+1009.746038830" Feb 01 06:59:39 crc kubenswrapper[4546]: I0201 06:59:39.100453 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 06:59:39 crc kubenswrapper[4546]: I0201 06:59:39.536502 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:39 crc kubenswrapper[4546]: I0201 06:59:39.537387 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 06:59:39 crc kubenswrapper[4546]: I0201 06:59:39.562846 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 01 06:59:39 crc kubenswrapper[4546]: I0201 06:59:39.576209 4546 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:59:39 crc kubenswrapper[4546]: I0201 06:59:39.649443 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/glance-default-external-api-0" Feb 01 06:59:40 crc kubenswrapper[4546]: I0201 06:59:40.585469 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="eff99989-51ad-47e8-99e0-0d7adc1ad550" containerName="nova-cell0-conductor-conductor" containerID="cri-o://90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76" gracePeriod=30 Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.442873 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.509583 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2xb2\" (UniqueName: \"kubernetes.io/projected/eff99989-51ad-47e8-99e0-0d7adc1ad550-kube-api-access-t2xb2\") pod \"eff99989-51ad-47e8-99e0-0d7adc1ad550\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.509634 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-config-data\") pod \"eff99989-51ad-47e8-99e0-0d7adc1ad550\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.509879 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-combined-ca-bundle\") pod \"eff99989-51ad-47e8-99e0-0d7adc1ad550\" (UID: \"eff99989-51ad-47e8-99e0-0d7adc1ad550\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.534025 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff99989-51ad-47e8-99e0-0d7adc1ad550-kube-api-access-t2xb2" (OuterVolumeSpecName: "kube-api-access-t2xb2") pod 
"eff99989-51ad-47e8-99e0-0d7adc1ad550" (UID: "eff99989-51ad-47e8-99e0-0d7adc1ad550"). InnerVolumeSpecName "kube-api-access-t2xb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.556429 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eff99989-51ad-47e8-99e0-0d7adc1ad550" (UID: "eff99989-51ad-47e8-99e0-0d7adc1ad550"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.569739 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-config-data" (OuterVolumeSpecName: "config-data") pod "eff99989-51ad-47e8-99e0-0d7adc1ad550" (UID: "eff99989-51ad-47e8-99e0-0d7adc1ad550"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.587359 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.616121 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-combined-ca-bundle\") pod \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.616166 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-run-httpd\") pod \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.616217 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-config-data\") pod \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.616272 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppx8t\" (UniqueName: \"kubernetes.io/projected/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-kube-api-access-ppx8t\") pod \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.616435 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-scripts\") pod \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.616501 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-log-httpd\") pod \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.616627 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-sg-core-conf-yaml\") pod \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\" (UID: \"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be\") " Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.617143 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2xb2\" (UniqueName: \"kubernetes.io/projected/eff99989-51ad-47e8-99e0-0d7adc1ad550-kube-api-access-t2xb2\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.617155 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.617166 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff99989-51ad-47e8-99e0-0d7adc1ad550-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.620531 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" (UID: "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.620791 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" (UID: "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.629444 4546 generic.go:334] "Generic (PLEG): container finished" podID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerID="7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25" exitCode=137 Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.629587 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerDied","Data":"7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25"} Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.629623 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cfe42e-7c8a-42ee-bba0-883cc5bef7be","Type":"ContainerDied","Data":"a79221df8f0a8efb355b65be309b092e31aff0d2b5ddb4b095ad5eb175bb741c"} Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.629991 4546 scope.go:117] "RemoveContainer" containerID="7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.630501 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.637155 4546 generic.go:334] "Generic (PLEG): container finished" podID="eff99989-51ad-47e8-99e0-0d7adc1ad550" containerID="90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76" exitCode=0 Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.637216 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eff99989-51ad-47e8-99e0-0d7adc1ad550","Type":"ContainerDied","Data":"90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76"} Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.637258 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eff99989-51ad-47e8-99e0-0d7adc1ad550","Type":"ContainerDied","Data":"38211dc5f0aa1edd97ca38c7e656807b9a4894633c8e5fbd3850e414ff13e15e"} Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.637332 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.637393 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-kube-api-access-ppx8t" (OuterVolumeSpecName: "kube-api-access-ppx8t") pod "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" (UID: "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be"). InnerVolumeSpecName "kube-api-access-ppx8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.640297 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-scripts" (OuterVolumeSpecName: "scripts") pod "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" (UID: "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.669533 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" (UID: "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.675627 4546 scope.go:117] "RemoveContainer" containerID="4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.689591 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.713719 4546 scope.go:117] "RemoveContainer" containerID="dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.721103 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.726981 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppx8t\" (UniqueName: \"kubernetes.io/projected/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-kube-api-access-ppx8t\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.729728 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.729824 4546 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc 
kubenswrapper[4546]: I0201 06:59:41.729906 4546 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.729976 4546 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.733055 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.733485 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="ceilometer-central-agent" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.733544 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="ceilometer-central-agent" Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.733595 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="proxy-httpd" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.733640 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="proxy-httpd" Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.733694 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="sg-core" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.733757 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="sg-core" Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.733818 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eff99989-51ad-47e8-99e0-0d7adc1ad550" containerName="nova-cell0-conductor-conductor" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.733897 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff99989-51ad-47e8-99e0-0d7adc1ad550" containerName="nova-cell0-conductor-conductor" Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.733948 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="ceilometer-notification-agent" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.733989 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="ceilometer-notification-agent" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.734178 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="proxy-httpd" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.736396 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="sg-core" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.736493 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="ceilometer-central-agent" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.736584 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" containerName="ceilometer-notification-agent" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.736670 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff99989-51ad-47e8-99e0-0d7adc1ad550" containerName="nova-cell0-conductor-conductor" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.737938 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.741006 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.741940 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q7jdh" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.744828 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.754748 4546 scope.go:117] "RemoveContainer" containerID="f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.775606 4546 scope.go:117] "RemoveContainer" containerID="7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25" Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.775961 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25\": container with ID starting with 7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25 not found: ID does not exist" containerID="7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.775996 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25"} err="failed to get container status \"7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25\": rpc error: code = NotFound desc = could not find container \"7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25\": container with ID starting with 7ea86261beb2d5fd712d38717bd72556c1f886e5e194eea8b6c0b93788c05e25 not found: ID does not exist" Feb 01 
06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.776042 4546 scope.go:117] "RemoveContainer" containerID="4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.776041 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" (UID: "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.776282 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c\": container with ID starting with 4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c not found: ID does not exist" containerID="4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.776398 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c"} err="failed to get container status \"4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c\": rpc error: code = NotFound desc = could not find container \"4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c\": container with ID starting with 4ea249f6cbff0fe762b6d06ad6badbe36274c95a848c57b7ef0e53cf59c6c42c not found: ID does not exist" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.776480 4546 scope.go:117] "RemoveContainer" containerID="dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a" Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.776998 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a\": container with ID starting with dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a not found: ID does not exist" containerID="dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.777031 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a"} err="failed to get container status \"dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a\": rpc error: code = NotFound desc = could not find container \"dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a\": container with ID starting with dc6418438719872de41868366f6776f7d4ed726ee3f50d33930be62e29e1fb5a not found: ID does not exist" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.777049 4546 scope.go:117] "RemoveContainer" containerID="f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41" Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.777415 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41\": container with ID starting with f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41 not found: ID does not exist" containerID="f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.777447 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41"} err="failed to get container status \"f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41\": rpc error: code = NotFound desc = could not find container 
\"f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41\": container with ID starting with f338ed14089a20fc661e74a5fd0da42d5ed760ec4d6193f6b114b05bdabf8f41 not found: ID does not exist" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.777470 4546 scope.go:117] "RemoveContainer" containerID="90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.809434 4546 scope.go:117] "RemoveContainer" containerID="90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76" Feb 01 06:59:41 crc kubenswrapper[4546]: E0201 06:59:41.809819 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76\": container with ID starting with 90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76 not found: ID does not exist" containerID="90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.809887 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76"} err="failed to get container status \"90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76\": rpc error: code = NotFound desc = could not find container \"90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76\": container with ID starting with 90183968768ba13a821c7db3a03d891fe481ffc7237dad01df1a1682b484ef76 not found: ID does not exist" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.817451 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-config-data" (OuterVolumeSpecName: "config-data") pod "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" (UID: "f7cfe42e-7c8a-42ee-bba0-883cc5bef7be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.831761 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a80f4f73-ef7a-473c-bcf6-669f57644741-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a80f4f73-ef7a-473c-bcf6-669f57644741\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.831836 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86mq\" (UniqueName: \"kubernetes.io/projected/a80f4f73-ef7a-473c-bcf6-669f57644741-kube-api-access-t86mq\") pod \"nova-cell0-conductor-0\" (UID: \"a80f4f73-ef7a-473c-bcf6-669f57644741\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.832007 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80f4f73-ef7a-473c-bcf6-669f57644741-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a80f4f73-ef7a-473c-bcf6-669f57644741\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.832147 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.832160 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.933604 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a80f4f73-ef7a-473c-bcf6-669f57644741-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a80f4f73-ef7a-473c-bcf6-669f57644741\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.933742 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a80f4f73-ef7a-473c-bcf6-669f57644741-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a80f4f73-ef7a-473c-bcf6-669f57644741\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.933796 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t86mq\" (UniqueName: \"kubernetes.io/projected/a80f4f73-ef7a-473c-bcf6-669f57644741-kube-api-access-t86mq\") pod \"nova-cell0-conductor-0\" (UID: \"a80f4f73-ef7a-473c-bcf6-669f57644741\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.939611 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a80f4f73-ef7a-473c-bcf6-669f57644741-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a80f4f73-ef7a-473c-bcf6-669f57644741\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.940076 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80f4f73-ef7a-473c-bcf6-669f57644741-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a80f4f73-ef7a-473c-bcf6-669f57644741\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.952697 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86mq\" (UniqueName: \"kubernetes.io/projected/a80f4f73-ef7a-473c-bcf6-669f57644741-kube-api-access-t86mq\") pod \"nova-cell0-conductor-0\" (UID: 
\"a80f4f73-ef7a-473c-bcf6-669f57644741\") " pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.963899 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.973044 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.993206 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:41 crc kubenswrapper[4546]: I0201 06:59:41.995394 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.000527 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.000705 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.008047 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.036960 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.037288 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-log-httpd\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.037409 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-run-httpd\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.037529 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.037643 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7nh7\" (UniqueName: \"kubernetes.io/projected/76aad09c-7fb0-499f-afca-a553aab90ad1-kube-api-access-j7nh7\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.037799 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-config-data\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.038025 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-scripts\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.055983 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.143718 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-log-httpd\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.144012 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-run-httpd\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.144055 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.144080 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7nh7\" (UniqueName: \"kubernetes.io/projected/76aad09c-7fb0-499f-afca-a553aab90ad1-kube-api-access-j7nh7\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.144120 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-config-data\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.144190 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-scripts\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.144242 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.144256 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-log-httpd\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.144528 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-run-httpd\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.154165 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-config-data\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.154598 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.154835 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.157521 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-scripts\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.180560 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7nh7\" (UniqueName: \"kubernetes.io/projected/76aad09c-7fb0-499f-afca-a553aab90ad1-kube-api-access-j7nh7\") pod \"ceilometer-0\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.310671 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.515463 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 06:59:42 crc kubenswrapper[4546]: W0201 06:59:42.515764 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda80f4f73_ef7a_473c_bcf6_669f57644741.slice/crio-80156e30b9519e8b418b6f22a60afe5e343df5c8cd0bc48a8ffff804f3f4ab0a WatchSource:0}: Error finding container 80156e30b9519e8b418b6f22a60afe5e343df5c8cd0bc48a8ffff804f3f4ab0a: Status 404 returned error can't find the container with id 80156e30b9519e8b418b6f22a60afe5e343df5c8cd0bc48a8ffff804f3f4ab0a Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.660257 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a80f4f73-ef7a-473c-bcf6-669f57644741","Type":"ContainerStarted","Data":"80156e30b9519e8b418b6f22a60afe5e343df5c8cd0bc48a8ffff804f3f4ab0a"} Feb 01 06:59:42 crc kubenswrapper[4546]: I0201 06:59:42.744251 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 06:59:42 crc kubenswrapper[4546]: W0201 06:59:42.744516 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76aad09c_7fb0_499f_afca_a553aab90ad1.slice/crio-7741c209670df836844af586c868b8381d18f1f7f022489dad276807ad9a9557 WatchSource:0}: Error finding container 7741c209670df836844af586c868b8381d18f1f7f022489dad276807ad9a9557: Status 404 returned error can't find the container with id 7741c209670df836844af586c868b8381d18f1f7f022489dad276807ad9a9557 Feb 01 06:59:43 crc kubenswrapper[4546]: I0201 06:59:43.663217 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff99989-51ad-47e8-99e0-0d7adc1ad550" 
path="/var/lib/kubelet/pods/eff99989-51ad-47e8-99e0-0d7adc1ad550/volumes" Feb 01 06:59:43 crc kubenswrapper[4546]: I0201 06:59:43.664033 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7cfe42e-7c8a-42ee-bba0-883cc5bef7be" path="/var/lib/kubelet/pods/f7cfe42e-7c8a-42ee-bba0-883cc5bef7be/volumes" Feb 01 06:59:43 crc kubenswrapper[4546]: I0201 06:59:43.685709 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a80f4f73-ef7a-473c-bcf6-669f57644741","Type":"ContainerStarted","Data":"4d3488784c7aaa4cbae123cdbf9ccb9b474a38c717f8059eeacd0b29362b6cc1"} Feb 01 06:59:43 crc kubenswrapper[4546]: I0201 06:59:43.685833 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:43 crc kubenswrapper[4546]: I0201 06:59:43.687089 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerStarted","Data":"fefa4df7d106b61293016f12588d6abd82428ce3e1b6fcab0480e7103410c527"} Feb 01 06:59:43 crc kubenswrapper[4546]: I0201 06:59:43.687161 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerStarted","Data":"7741c209670df836844af586c868b8381d18f1f7f022489dad276807ad9a9557"} Feb 01 06:59:43 crc kubenswrapper[4546]: I0201 06:59:43.708090 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.708076567 podStartE2EDuration="2.708076567s" podCreationTimestamp="2026-02-01 06:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:59:43.705202981 +0000 UTC m=+1014.356138997" watchObservedRunningTime="2026-02-01 06:59:43.708076567 +0000 UTC m=+1014.359012584" Feb 01 06:59:44 crc 
kubenswrapper[4546]: I0201 06:59:44.705026 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerStarted","Data":"c80b0bfc435be5a41c60880558336e3317d85bdb64376dc5b6efb4ac623bf962"} Feb 01 06:59:45 crc kubenswrapper[4546]: I0201 06:59:45.718482 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerStarted","Data":"e25dec80ca4d32fdfd281d21bd27a1e13d4c3fde6ce743d6799812758b279e79"} Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.089491 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.630661 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-rq7cx"] Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.632348 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.634136 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.636572 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.671937 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rq7cx"] Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.675514 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.675696 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfwlx\" (UniqueName: \"kubernetes.io/projected/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-kube-api-access-mfwlx\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.675785 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-scripts\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.675901 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-config-data\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.772337 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerStarted","Data":"bc86699e7d9725585b5f3f4777d648363be52b2a67d6ef9144d0e9e168eb67fe"} Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.772641 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.778101 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfwlx\" (UniqueName: \"kubernetes.io/projected/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-kube-api-access-mfwlx\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.778179 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-scripts\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.778268 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-config-data\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.778345 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.791767 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-scripts\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.791824 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.799400 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-config-data\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.815474 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.347615445 podStartE2EDuration="6.815451391s" podCreationTimestamp="2026-02-01 06:59:41 +0000 UTC" firstStartedPulling="2026-02-01 06:59:42.749207711 +0000 UTC m=+1013.400143727" lastFinishedPulling="2026-02-01 06:59:47.217043657 +0000 UTC m=+1017.867979673" observedRunningTime="2026-02-01 06:59:47.798228914 +0000 UTC m=+1018.449164920" watchObservedRunningTime="2026-02-01 06:59:47.815451391 +0000 UTC m=+1018.466387407" Feb 01 
06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.844769 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.846208 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.852060 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.857966 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfwlx\" (UniqueName: \"kubernetes.io/projected/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-kube-api-access-mfwlx\") pod \"nova-cell0-cell-mapping-rq7cx\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.880743 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.880833 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.880893 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-989d6\" (UniqueName: \"kubernetes.io/projected/e44d36d3-89ac-4873-8913-a3a0c6faa798-kube-api-access-989d6\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.899731 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.901646 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.905350 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.919822 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.947915 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.950447 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.988618 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.988664 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn49g\" (UniqueName: \"kubernetes.io/projected/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-kube-api-access-nn49g\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.988700 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.988725 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-989d6\" (UniqueName: \"kubernetes.io/projected/e44d36d3-89ac-4873-8913-a3a0c6faa798-kube-api-access-989d6\") pod \"nova-cell1-novncproxy-0\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.988815 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-config-data\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:47 crc 
kubenswrapper[4546]: I0201 06:59:47.988870 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.988923 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-logs\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:47 crc kubenswrapper[4546]: I0201 06:59:47.995811 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.002767 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.015679 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-989d6\" (UniqueName: \"kubernetes.io/projected/e44d36d3-89ac-4873-8913-a3a0c6faa798-kube-api-access-989d6\") pod \"nova-cell1-novncproxy-0\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.024005 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 06:59:48 crc 
kubenswrapper[4546]: I0201 06:59:48.025411 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.029750 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.055619 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.090438 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-config-data\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.091232 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-config-data\") pod \"nova-scheduler-0\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.091321 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.091398 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2d75\" (UniqueName: \"kubernetes.io/projected/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-kube-api-access-s2d75\") pod \"nova-scheduler-0\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " pod="openstack/nova-scheduler-0" Feb 01 06:59:48 
crc kubenswrapper[4546]: I0201 06:59:48.091491 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-logs\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.091573 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.091638 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn49g\" (UniqueName: \"kubernetes.io/projected/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-kube-api-access-nn49g\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.096558 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-config-data\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.096919 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-logs\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.104049 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.104516 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.105622 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.107626 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.129694 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn49g\" (UniqueName: \"kubernetes.io/projected/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-kube-api-access-nn49g\") pod \"nova-api-0\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " pod="openstack/nova-api-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.168112 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.183343 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.192653 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-config-data\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.192755 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.192993 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-config-data\") pod \"nova-scheduler-0\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.193046 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwt6r\" (UniqueName: \"kubernetes.io/projected/989fd16d-08e8-4ccf-b02c-10c40e692324-kube-api-access-dwt6r\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.193072 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 
06:59:48.193088 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2d75\" (UniqueName: \"kubernetes.io/projected/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-kube-api-access-s2d75\") pod \"nova-scheduler-0\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.193131 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989fd16d-08e8-4ccf-b02c-10c40e692324-logs\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.200565 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.211825 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-config-data\") pod \"nova-scheduler-0\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.211908 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-664f5cdb7c-j8rfz"] Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.215180 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.221827 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.253198 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-664f5cdb7c-j8rfz"] Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.260983 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2d75\" (UniqueName: \"kubernetes.io/projected/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-kube-api-access-s2d75\") pod \"nova-scheduler-0\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295019 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwt6r\" (UniqueName: \"kubernetes.io/projected/989fd16d-08e8-4ccf-b02c-10c40e692324-kube-api-access-dwt6r\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295125 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989fd16d-08e8-4ccf-b02c-10c40e692324-logs\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295188 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-svc\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295246 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-config-data\") pod \"nova-metadata-0\" (UID: 
\"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295264 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-config\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295286 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-sb\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295306 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295327 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9d5n\" (UniqueName: \"kubernetes.io/projected/43650776-3c2d-4c00-b082-55e3c3a9dce3-kube-api-access-t9d5n\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295348 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-nb\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: 
\"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.295367 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-swift-storage-0\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.308738 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989fd16d-08e8-4ccf-b02c-10c40e692324-logs\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.309347 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-config-data\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.315338 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.322010 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwt6r\" (UniqueName: \"kubernetes.io/projected/989fd16d-08e8-4ccf-b02c-10c40e692324-kube-api-access-dwt6r\") pod \"nova-metadata-0\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.349160 4546 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.407220 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-svc\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.458551 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-config\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.458602 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-sb\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.458645 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9d5n\" (UniqueName: \"kubernetes.io/projected/43650776-3c2d-4c00-b082-55e3c3a9dce3-kube-api-access-t9d5n\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.458684 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-nb\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " 
pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.458713 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-swift-storage-0\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.459644 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-swift-storage-0\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.421895 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-svc\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.460186 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-config\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.460479 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-sb\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 
06:59:48.422585 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.475751 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-nb\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.486455 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9d5n\" (UniqueName: \"kubernetes.io/projected/43650776-3c2d-4c00-b082-55e3c3a9dce3-kube-api-access-t9d5n\") pod \"dnsmasq-dns-664f5cdb7c-j8rfz\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.759492 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:48 crc kubenswrapper[4546]: W0201 06:59:48.844221 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fdf5e3f_6e33_4f70_95e1_c54b7c97df47.slice/crio-40948c87c5fa94d106364777cb4e467a54ba5708043a8acf28e1a8c38cb70f3a WatchSource:0}: Error finding container 40948c87c5fa94d106364777cb4e467a54ba5708043a8acf28e1a8c38cb70f3a: Status 404 returned error can't find the container with id 40948c87c5fa94d106364777cb4e467a54ba5708043a8acf28e1a8c38cb70f3a Feb 01 06:59:48 crc kubenswrapper[4546]: I0201 06:59:48.902200 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rq7cx"] Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.087559 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.380141 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 06:59:49 crc kubenswrapper[4546]: W0201 06:59:49.380196 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode44d36d3_89ac_4873_8913_a3a0c6faa798.slice/crio-b30020868fc179ed4e1413a86e04333ce02608f076ed004ac2d01ae39d2e32d2 WatchSource:0}: Error finding container b30020868fc179ed4e1413a86e04333ce02608f076ed004ac2d01ae39d2e32d2: Status 404 returned error can't find the container with id b30020868fc179ed4e1413a86e04333ce02608f076ed004ac2d01ae39d2e32d2 Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.568608 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.617683 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-664f5cdb7c-j8rfz"] Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 
06:59:49.810237 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989fd16d-08e8-4ccf-b02c-10c40e692324","Type":"ContainerStarted","Data":"698c2db5689182771ba98ae2f59a83f8dccbc7bfa54252dce413e97ca1c9bcd4"} Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.816086 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a28f2e9-b910-48ce-a0d7-d97b27478c9a","Type":"ContainerStarted","Data":"06b3432cab63c943348bf0d7b502bc9fc93b58d0d838bd92a8b992e5a8c1fec9"} Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.818563 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.822473 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e44d36d3-89ac-4873-8913-a3a0c6faa798","Type":"ContainerStarted","Data":"b30020868fc179ed4e1413a86e04333ce02608f076ed004ac2d01ae39d2e32d2"} Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.834159 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rq7cx" event={"ID":"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47","Type":"ContainerStarted","Data":"704b1cf06fa9bd035f9f48c831d5894e0f89d194a217f6bffff62f09614f62ce"} Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.834202 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rq7cx" event={"ID":"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47","Type":"ContainerStarted","Data":"40948c87c5fa94d106364777cb4e467a54ba5708043a8acf28e1a8c38cb70f3a"} Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.848540 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" event={"ID":"43650776-3c2d-4c00-b082-55e3c3a9dce3","Type":"ContainerStarted","Data":"4311883920c5757173a30b5a8ab485929f68167d253ad9b34ec57c50a2c32633"} Feb 01 06:59:49 crc 
kubenswrapper[4546]: I0201 06:59:49.852978 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"efd7b7ae-e4ee-45fd-865b-732ec58a4c69","Type":"ContainerStarted","Data":"da1b0cb577d0638904c398d2c35ca97f700c68f2996f68a5cbf6fbf494af21f0"} Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.870648 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k6q2g"] Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.872169 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.875990 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.876174 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.910969 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-rq7cx" podStartSLOduration=2.910954909 podStartE2EDuration="2.910954909s" podCreationTimestamp="2026-02-01 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:59:49.870633199 +0000 UTC m=+1020.521569215" watchObservedRunningTime="2026-02-01 06:59:49.910954909 +0000 UTC m=+1020.561890925" Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.931507 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szd7k\" (UniqueName: \"kubernetes.io/projected/4b02eabc-af33-4e6e-8e03-e95876644ea7-kube-api-access-szd7k\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 
06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.931598 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.931636 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-scripts\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.931668 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-config-data\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:49 crc kubenswrapper[4546]: I0201 06:59:49.932346 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k6q2g"] Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.038063 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.038136 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-scripts\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.038169 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-config-data\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.040470 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szd7k\" (UniqueName: \"kubernetes.io/projected/4b02eabc-af33-4e6e-8e03-e95876644ea7-kube-api-access-szd7k\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.050459 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-scripts\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.061958 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-config-data\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.062522 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szd7k\" (UniqueName: 
\"kubernetes.io/projected/4b02eabc-af33-4e6e-8e03-e95876644ea7-kube-api-access-szd7k\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.063434 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k6q2g\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.206655 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.816206 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k6q2g"] Feb 01 06:59:50 crc kubenswrapper[4546]: W0201 06:59:50.835262 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b02eabc_af33_4e6e_8e03_e95876644ea7.slice/crio-3059cd600ec716ec005641e7e04c5ebecd11ac3805b04e33c72ca9436f3aacca WatchSource:0}: Error finding container 3059cd600ec716ec005641e7e04c5ebecd11ac3805b04e33c72ca9436f3aacca: Status 404 returned error can't find the container with id 3059cd600ec716ec005641e7e04c5ebecd11ac3805b04e33c72ca9436f3aacca Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.947272 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k6q2g" event={"ID":"4b02eabc-af33-4e6e-8e03-e95876644ea7","Type":"ContainerStarted","Data":"3059cd600ec716ec005641e7e04c5ebecd11ac3805b04e33c72ca9436f3aacca"} Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.951175 4546 generic.go:334] "Generic (PLEG): container finished" 
podID="43650776-3c2d-4c00-b082-55e3c3a9dce3" containerID="172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482" exitCode=0 Feb 01 06:59:50 crc kubenswrapper[4546]: I0201 06:59:50.953675 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" event={"ID":"43650776-3c2d-4c00-b082-55e3c3a9dce3","Type":"ContainerDied","Data":"172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482"} Feb 01 06:59:51 crc kubenswrapper[4546]: I0201 06:59:51.795701 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 06:59:51 crc kubenswrapper[4546]: I0201 06:59:51.802076 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 06:59:51 crc kubenswrapper[4546]: I0201 06:59:51.963327 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k6q2g" event={"ID":"4b02eabc-af33-4e6e-8e03-e95876644ea7","Type":"ContainerStarted","Data":"dbe98116b2536d914d6e7edcd3966feaffeb60826ac1a7318f5adcbb87b511b9"} Feb 01 06:59:51 crc kubenswrapper[4546]: I0201 06:59:51.971402 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" event={"ID":"43650776-3c2d-4c00-b082-55e3c3a9dce3","Type":"ContainerStarted","Data":"381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8"} Feb 01 06:59:51 crc kubenswrapper[4546]: I0201 06:59:51.971582 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:51 crc kubenswrapper[4546]: I0201 06:59:51.982446 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-k6q2g" podStartSLOduration=2.98242962 podStartE2EDuration="2.98242962s" podCreationTimestamp="2026-02-01 06:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-01 06:59:51.975544622 +0000 UTC m=+1022.626480637" watchObservedRunningTime="2026-02-01 06:59:51.98242962 +0000 UTC m=+1022.633365637" Feb 01 06:59:51 crc kubenswrapper[4546]: I0201 06:59:51.992182 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" podStartSLOduration=3.992165845 podStartE2EDuration="3.992165845s" podCreationTimestamp="2026-02-01 06:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:59:51.990182176 +0000 UTC m=+1022.641118181" watchObservedRunningTime="2026-02-01 06:59:51.992165845 +0000 UTC m=+1022.643101861" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.007681 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"efd7b7ae-e4ee-45fd-865b-732ec58a4c69","Type":"ContainerStarted","Data":"7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a"} Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.009131 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"efd7b7ae-e4ee-45fd-865b-732ec58a4c69","Type":"ContainerStarted","Data":"20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e"} Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.011462 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989fd16d-08e8-4ccf-b02c-10c40e692324","Type":"ContainerStarted","Data":"41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6"} Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.011518 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989fd16d-08e8-4ccf-b02c-10c40e692324","Type":"ContainerStarted","Data":"c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873"} Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.011778 4546 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerName="nova-metadata-log" containerID="cri-o://c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873" gracePeriod=30 Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.011826 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerName="nova-metadata-metadata" containerID="cri-o://41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6" gracePeriod=30 Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.014718 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a28f2e9-b910-48ce-a0d7-d97b27478c9a","Type":"ContainerStarted","Data":"e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89"} Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.022420 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e44d36d3-89ac-4873-8913-a3a0c6faa798","Type":"ContainerStarted","Data":"40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5"} Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.022825 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e44d36d3-89ac-4873-8913-a3a0c6faa798" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5" gracePeriod=30 Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.041043 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.41666485 podStartE2EDuration="8.041032632s" podCreationTimestamp="2026-02-01 06:59:47 +0000 UTC" firstStartedPulling="2026-02-01 06:59:49.097295595 +0000 UTC m=+1019.748231611" 
lastFinishedPulling="2026-02-01 06:59:53.721663378 +0000 UTC m=+1024.372599393" observedRunningTime="2026-02-01 06:59:55.034419716 +0000 UTC m=+1025.685355732" watchObservedRunningTime="2026-02-01 06:59:55.041032632 +0000 UTC m=+1025.691968647" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.056438 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.922731721 podStartE2EDuration="8.056430718s" podCreationTimestamp="2026-02-01 06:59:47 +0000 UTC" firstStartedPulling="2026-02-01 06:59:49.583947028 +0000 UTC m=+1020.234883044" lastFinishedPulling="2026-02-01 06:59:53.717646025 +0000 UTC m=+1024.368582041" observedRunningTime="2026-02-01 06:59:55.051015971 +0000 UTC m=+1025.701951987" watchObservedRunningTime="2026-02-01 06:59:55.056430718 +0000 UTC m=+1025.707366735" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.065754 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.729735828 podStartE2EDuration="8.065747181s" podCreationTimestamp="2026-02-01 06:59:47 +0000 UTC" firstStartedPulling="2026-02-01 06:59:49.384621802 +0000 UTC m=+1020.035557818" lastFinishedPulling="2026-02-01 06:59:53.720633156 +0000 UTC m=+1024.371569171" observedRunningTime="2026-02-01 06:59:55.064657216 +0000 UTC m=+1025.715593232" watchObservedRunningTime="2026-02-01 06:59:55.065747181 +0000 UTC m=+1025.716683196" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.093945 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.158561073 podStartE2EDuration="7.093917064s" podCreationTimestamp="2026-02-01 06:59:48 +0000 UTC" firstStartedPulling="2026-02-01 06:59:49.785261445 +0000 UTC m=+1020.436197451" lastFinishedPulling="2026-02-01 06:59:53.720617436 +0000 UTC m=+1024.371553442" observedRunningTime="2026-02-01 06:59:55.080696772 +0000 UTC 
m=+1025.731632789" watchObservedRunningTime="2026-02-01 06:59:55.093917064 +0000 UTC m=+1025.744853080" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.659042 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.788701 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwt6r\" (UniqueName: \"kubernetes.io/projected/989fd16d-08e8-4ccf-b02c-10c40e692324-kube-api-access-dwt6r\") pod \"989fd16d-08e8-4ccf-b02c-10c40e692324\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.788827 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-combined-ca-bundle\") pod \"989fd16d-08e8-4ccf-b02c-10c40e692324\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.788988 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-config-data\") pod \"989fd16d-08e8-4ccf-b02c-10c40e692324\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.789005 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989fd16d-08e8-4ccf-b02c-10c40e692324-logs\") pod \"989fd16d-08e8-4ccf-b02c-10c40e692324\" (UID: \"989fd16d-08e8-4ccf-b02c-10c40e692324\") " Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.790460 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/989fd16d-08e8-4ccf-b02c-10c40e692324-logs" (OuterVolumeSpecName: "logs") pod "989fd16d-08e8-4ccf-b02c-10c40e692324" (UID: 
"989fd16d-08e8-4ccf-b02c-10c40e692324"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.796163 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989fd16d-08e8-4ccf-b02c-10c40e692324-kube-api-access-dwt6r" (OuterVolumeSpecName: "kube-api-access-dwt6r") pod "989fd16d-08e8-4ccf-b02c-10c40e692324" (UID: "989fd16d-08e8-4ccf-b02c-10c40e692324"). InnerVolumeSpecName "kube-api-access-dwt6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.817108 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "989fd16d-08e8-4ccf-b02c-10c40e692324" (UID: "989fd16d-08e8-4ccf-b02c-10c40e692324"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.835973 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-config-data" (OuterVolumeSpecName: "config-data") pod "989fd16d-08e8-4ccf-b02c-10c40e692324" (UID: "989fd16d-08e8-4ccf-b02c-10c40e692324"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.892145 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwt6r\" (UniqueName: \"kubernetes.io/projected/989fd16d-08e8-4ccf-b02c-10c40e692324-kube-api-access-dwt6r\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.892200 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.892211 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/989fd16d-08e8-4ccf-b02c-10c40e692324-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:55 crc kubenswrapper[4546]: I0201 06:59:55.892222 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/989fd16d-08e8-4ccf-b02c-10c40e692324-logs\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.035427 4546 generic.go:334] "Generic (PLEG): container finished" podID="4b02eabc-af33-4e6e-8e03-e95876644ea7" containerID="dbe98116b2536d914d6e7edcd3966feaffeb60826ac1a7318f5adcbb87b511b9" exitCode=0 Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.035537 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k6q2g" event={"ID":"4b02eabc-af33-4e6e-8e03-e95876644ea7","Type":"ContainerDied","Data":"dbe98116b2536d914d6e7edcd3966feaffeb60826ac1a7318f5adcbb87b511b9"} Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.038034 4546 generic.go:334] "Generic (PLEG): container finished" podID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerID="41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6" exitCode=0 Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 
06:59:56.038072 4546 generic.go:334] "Generic (PLEG): container finished" podID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerID="c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873" exitCode=143 Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.039049 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.039151 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989fd16d-08e8-4ccf-b02c-10c40e692324","Type":"ContainerDied","Data":"41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6"} Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.039278 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989fd16d-08e8-4ccf-b02c-10c40e692324","Type":"ContainerDied","Data":"c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873"} Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.039305 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"989fd16d-08e8-4ccf-b02c-10c40e692324","Type":"ContainerDied","Data":"698c2db5689182771ba98ae2f59a83f8dccbc7bfa54252dce413e97ca1c9bcd4"} Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.039331 4546 scope.go:117] "RemoveContainer" containerID="41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.086248 4546 scope.go:117] "RemoveContainer" containerID="c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.104797 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.118619 4546 scope.go:117] "RemoveContainer" containerID="41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6" Feb 01 06:59:56 crc 
kubenswrapper[4546]: E0201 06:59:56.119144 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6\": container with ID starting with 41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6 not found: ID does not exist" containerID="41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.119186 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6"} err="failed to get container status \"41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6\": rpc error: code = NotFound desc = could not find container \"41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6\": container with ID starting with 41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6 not found: ID does not exist" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.119210 4546 scope.go:117] "RemoveContainer" containerID="c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.119303 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 06:59:56 crc kubenswrapper[4546]: E0201 06:59:56.121108 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873\": container with ID starting with c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873 not found: ID does not exist" containerID="c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.121151 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873"} err="failed to get container status \"c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873\": rpc error: code = NotFound desc = could not find container \"c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873\": container with ID starting with c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873 not found: ID does not exist" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.121170 4546 scope.go:117] "RemoveContainer" containerID="41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.124060 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6"} err="failed to get container status \"41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6\": rpc error: code = NotFound desc = could not find container \"41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6\": container with ID starting with 41bc971c049c71776bf4bc579be82e62352459e8c952285290d143221aa835c6 not found: ID does not exist" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.124085 4546 scope.go:117] "RemoveContainer" containerID="c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.125452 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873"} err="failed to get container status \"c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873\": rpc error: code = NotFound desc = could not find container \"c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873\": container with ID starting with c5611fdf0b16c0903c3d3425e67d453db4b816e5d4b118fa4d8a56cb9cdf1873 not found: ID does not 
exist" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.144895 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 06:59:56 crc kubenswrapper[4546]: E0201 06:59:56.145409 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerName="nova-metadata-log" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.145430 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerName="nova-metadata-log" Feb 01 06:59:56 crc kubenswrapper[4546]: E0201 06:59:56.145442 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerName="nova-metadata-metadata" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.145448 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerName="nova-metadata-metadata" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.145635 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerName="nova-metadata-metadata" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.145649 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="989fd16d-08e8-4ccf-b02c-10c40e692324" containerName="nova-metadata-log" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.146662 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.149334 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.150680 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.154023 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.198790 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.198839 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h88jk\" (UniqueName: \"kubernetes.io/projected/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-kube-api-access-h88jk\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.198882 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-logs\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.199168 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.199345 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-config-data\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.302030 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-config-data\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.302282 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.303126 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h88jk\" (UniqueName: \"kubernetes.io/projected/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-kube-api-access-h88jk\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.303177 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-logs\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.303268 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.303699 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-logs\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.307821 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-config-data\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.314423 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.314560 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.327247 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h88jk\" (UniqueName: \"kubernetes.io/projected/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-kube-api-access-h88jk\") pod \"nova-metadata-0\" 
(UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.467283 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 06:59:56 crc kubenswrapper[4546]: I0201 06:59:56.870839 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.050810 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08","Type":"ContainerStarted","Data":"0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6"} Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.051073 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08","Type":"ContainerStarted","Data":"c61d6aac1110bb30a6e6744137ccf7951dc19ae08fb52db80b8a95d4abd36d74"} Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.356624 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.433298 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szd7k\" (UniqueName: \"kubernetes.io/projected/4b02eabc-af33-4e6e-8e03-e95876644ea7-kube-api-access-szd7k\") pod \"4b02eabc-af33-4e6e-8e03-e95876644ea7\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.433779 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-combined-ca-bundle\") pod \"4b02eabc-af33-4e6e-8e03-e95876644ea7\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.433822 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-config-data\") pod \"4b02eabc-af33-4e6e-8e03-e95876644ea7\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.433881 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-scripts\") pod \"4b02eabc-af33-4e6e-8e03-e95876644ea7\" (UID: \"4b02eabc-af33-4e6e-8e03-e95876644ea7\") " Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.438830 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b02eabc-af33-4e6e-8e03-e95876644ea7-kube-api-access-szd7k" (OuterVolumeSpecName: "kube-api-access-szd7k") pod "4b02eabc-af33-4e6e-8e03-e95876644ea7" (UID: "4b02eabc-af33-4e6e-8e03-e95876644ea7"). InnerVolumeSpecName "kube-api-access-szd7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.438965 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-scripts" (OuterVolumeSpecName: "scripts") pod "4b02eabc-af33-4e6e-8e03-e95876644ea7" (UID: "4b02eabc-af33-4e6e-8e03-e95876644ea7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.456271 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b02eabc-af33-4e6e-8e03-e95876644ea7" (UID: "4b02eabc-af33-4e6e-8e03-e95876644ea7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.457780 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-config-data" (OuterVolumeSpecName: "config-data") pod "4b02eabc-af33-4e6e-8e03-e95876644ea7" (UID: "4b02eabc-af33-4e6e-8e03-e95876644ea7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.536524 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szd7k\" (UniqueName: \"kubernetes.io/projected/4b02eabc-af33-4e6e-8e03-e95876644ea7-kube-api-access-szd7k\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.536548 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.536559 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.536569 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b02eabc-af33-4e6e-8e03-e95876644ea7-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:57 crc kubenswrapper[4546]: I0201 06:59:57.668275 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989fd16d-08e8-4ccf-b02c-10c40e692324" path="/var/lib/kubelet/pods/989fd16d-08e8-4ccf-b02c-10c40e692324/volumes" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.067121 4546 generic.go:334] "Generic (PLEG): container finished" podID="8fdf5e3f-6e33-4f70-95e1-c54b7c97df47" containerID="704b1cf06fa9bd035f9f48c831d5894e0f89d194a217f6bffff62f09614f62ce" exitCode=0 Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.067359 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rq7cx" event={"ID":"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47","Type":"ContainerDied","Data":"704b1cf06fa9bd035f9f48c831d5894e0f89d194a217f6bffff62f09614f62ce"} Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.073161 
4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k6q2g" event={"ID":"4b02eabc-af33-4e6e-8e03-e95876644ea7","Type":"ContainerDied","Data":"3059cd600ec716ec005641e7e04c5ebecd11ac3805b04e33c72ca9436f3aacca"} Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.073262 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3059cd600ec716ec005641e7e04c5ebecd11ac3805b04e33c72ca9436f3aacca" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.073381 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k6q2g" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.079100 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08","Type":"ContainerStarted","Data":"39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48"} Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.129705 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.129688775 podStartE2EDuration="2.129688775s" podCreationTimestamp="2026-02-01 06:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:59:58.124130738 +0000 UTC m=+1028.775066754" watchObservedRunningTime="2026-02-01 06:59:58.129688775 +0000 UTC m=+1028.780624791" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.158726 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 06:59:58 crc kubenswrapper[4546]: E0201 06:59:58.159312 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b02eabc-af33-4e6e-8e03-e95876644ea7" containerName="nova-cell1-conductor-db-sync" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.159333 4546 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4b02eabc-af33-4e6e-8e03-e95876644ea7" containerName="nova-cell1-conductor-db-sync" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.159602 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b02eabc-af33-4e6e-8e03-e95876644ea7" containerName="nova-cell1-conductor-db-sync" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.160285 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.168288 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.168616 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.185311 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.216028 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.216085 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.251769 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9792\" (UniqueName: \"kubernetes.io/projected/5ebb4a9e-f44c-433d-950c-568436111388-kube-api-access-c9792\") pod \"nova-cell1-conductor-0\" (UID: \"5ebb4a9e-f44c-433d-950c-568436111388\") " pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.252205 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5ebb4a9e-f44c-433d-950c-568436111388-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5ebb4a9e-f44c-433d-950c-568436111388\") " pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.252278 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ebb4a9e-f44c-433d-950c-568436111388-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5ebb4a9e-f44c-433d-950c-568436111388\") " pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.349923 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.349993 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.354271 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ebb4a9e-f44c-433d-950c-568436111388-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5ebb4a9e-f44c-433d-950c-568436111388\") " pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.354316 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ebb4a9e-f44c-433d-950c-568436111388-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5ebb4a9e-f44c-433d-950c-568436111388\") " pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.354412 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9792\" (UniqueName: \"kubernetes.io/projected/5ebb4a9e-f44c-433d-950c-568436111388-kube-api-access-c9792\") pod \"nova-cell1-conductor-0\" (UID: 
\"5ebb4a9e-f44c-433d-950c-568436111388\") " pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.377072 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ebb4a9e-f44c-433d-950c-568436111388-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5ebb4a9e-f44c-433d-950c-568436111388\") " pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.384750 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9792\" (UniqueName: \"kubernetes.io/projected/5ebb4a9e-f44c-433d-950c-568436111388-kube-api-access-c9792\") pod \"nova-cell1-conductor-0\" (UID: \"5ebb4a9e-f44c-433d-950c-568436111388\") " pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.385630 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ebb4a9e-f44c-433d-950c-568436111388-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5ebb4a9e-f44c-433d-950c-568436111388\") " pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.386585 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.480552 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.762056 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.841555 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b84f76f59-qvk5p"] Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.841830 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" podUID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" containerName="dnsmasq-dns" containerID="cri-o://85282ebf44055d02053ca601be216e4ae8e1c8b7a1b5dc69caffdb56b55e2400" gracePeriod=10 Feb 01 06:59:58 crc kubenswrapper[4546]: I0201 06:59:58.991197 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.123607 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5ebb4a9e-f44c-433d-950c-568436111388","Type":"ContainerStarted","Data":"3168da9d60afbe2f5c8e536897fef33a66264abae5e666b66a03b21d1a239b87"} Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.127762 4546 generic.go:334] "Generic (PLEG): container finished" podID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" containerID="85282ebf44055d02053ca601be216e4ae8e1c8b7a1b5dc69caffdb56b55e2400" exitCode=0 Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.128543 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" event={"ID":"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1","Type":"ContainerDied","Data":"85282ebf44055d02053ca601be216e4ae8e1c8b7a1b5dc69caffdb56b55e2400"} Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.184670 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 01 06:59:59 crc 
kubenswrapper[4546]: I0201 06:59:59.299703 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.300031 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.445493 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.492044 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-sb\") pod \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.492302 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-config\") pod \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.492536 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-nb\") pod \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 
06:59:59.492580 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-svc\") pod \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.492615 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-swift-storage-0\") pod \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.492721 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dpjk\" (UniqueName: \"kubernetes.io/projected/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-kube-api-access-5dpjk\") pod \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\" (UID: \"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.522014 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-kube-api-access-5dpjk" (OuterVolumeSpecName: "kube-api-access-5dpjk") pod "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" (UID: "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1"). InnerVolumeSpecName "kube-api-access-5dpjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.547701 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.595395 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-scripts\") pod \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.595583 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-config-data\") pod \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.595648 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-combined-ca-bundle\") pod \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.596272 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfwlx\" (UniqueName: \"kubernetes.io/projected/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-kube-api-access-mfwlx\") pod \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\" (UID: \"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47\") " Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.597302 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dpjk\" (UniqueName: \"kubernetes.io/projected/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-kube-api-access-5dpjk\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.611561 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-kube-api-access-mfwlx" 
(OuterVolumeSpecName: "kube-api-access-mfwlx") pod "8fdf5e3f-6e33-4f70-95e1-c54b7c97df47" (UID: "8fdf5e3f-6e33-4f70-95e1-c54b7c97df47"). InnerVolumeSpecName "kube-api-access-mfwlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.618010 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-scripts" (OuterVolumeSpecName: "scripts") pod "8fdf5e3f-6e33-4f70-95e1-c54b7c97df47" (UID: "8fdf5e3f-6e33-4f70-95e1-c54b7c97df47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.652136 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" (UID: "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.654974 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-config-data" (OuterVolumeSpecName: "config-data") pod "8fdf5e3f-6e33-4f70-95e1-c54b7c97df47" (UID: "8fdf5e3f-6e33-4f70-95e1-c54b7c97df47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.661864 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fdf5e3f-6e33-4f70-95e1-c54b7c97df47" (UID: "8fdf5e3f-6e33-4f70-95e1-c54b7c97df47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.679378 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" (UID: "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.689333 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" (UID: "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.694746 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" (UID: "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.700288 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfwlx\" (UniqueName: \"kubernetes.io/projected/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-kube-api-access-mfwlx\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.700315 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.700326 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.700339 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.700351 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.700360 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.700370 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.700379 4546 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.711460 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-config" (OuterVolumeSpecName: "config") pod "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" (UID: "ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:59:59 crc kubenswrapper[4546]: I0201 06:59:59.801365 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.148058 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" event={"ID":"ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1","Type":"ContainerDied","Data":"5550bc567837eb2f0bb0f129e985a10ba4e034f0538fba7c5d42604f8d3c8521"} Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.148097 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.148115 4546 scope.go:117] "RemoveContainer" containerID="85282ebf44055d02053ca601be216e4ae8e1c8b7a1b5dc69caffdb56b55e2400" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.151590 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rq7cx" event={"ID":"8fdf5e3f-6e33-4f70-95e1-c54b7c97df47","Type":"ContainerDied","Data":"40948c87c5fa94d106364777cb4e467a54ba5708043a8acf28e1a8c38cb70f3a"} Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.151635 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40948c87c5fa94d106364777cb4e467a54ba5708043a8acf28e1a8c38cb70f3a" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.152423 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rq7cx" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.167309 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5ebb4a9e-f44c-433d-950c-568436111388","Type":"ContainerStarted","Data":"3a82d14e088346eac5d9208103a9b23b8a8d6360e3212849b079c54cf00b77f9"} Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.168155 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.174963 4546 scope.go:117] "RemoveContainer" containerID="4cc7281af917c039e5d2941c69756a0cdf60fb972929b5d2467215059aaac380" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.189291 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b84f76f59-qvk5p"] Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.194082 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b84f76f59-qvk5p"] Feb 01 07:00:00 crc kubenswrapper[4546]: 
I0201 07:00:00.235321 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq"] Feb 01 07:00:00 crc kubenswrapper[4546]: E0201 07:00:00.236111 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" containerName="dnsmasq-dns" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.236131 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" containerName="dnsmasq-dns" Feb 01 07:00:00 crc kubenswrapper[4546]: E0201 07:00:00.236160 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" containerName="init" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.236167 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" containerName="init" Feb 01 07:00:00 crc kubenswrapper[4546]: E0201 07:00:00.236182 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdf5e3f-6e33-4f70-95e1-c54b7c97df47" containerName="nova-manage" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.236188 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdf5e3f-6e33-4f70-95e1-c54b7c97df47" containerName="nova-manage" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.236379 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" containerName="dnsmasq-dns" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.236417 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fdf5e3f-6e33-4f70-95e1-c54b7c97df47" containerName="nova-manage" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.237101 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.238928 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.238907638 podStartE2EDuration="2.238907638s" podCreationTimestamp="2026-02-01 06:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:00.212468768 +0000 UTC m=+1030.863404784" watchObservedRunningTime="2026-02-01 07:00:00.238907638 +0000 UTC m=+1030.889843654" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.240255 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.240466 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.262832 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq"] Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.317798 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b38979b-e35d-4fa3-a515-1e91fb6bf310-config-volume\") pod \"collect-profiles-29498820-982tq\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.324754 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b38979b-e35d-4fa3-a515-1e91fb6bf310-secret-volume\") pod 
\"collect-profiles-29498820-982tq\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.324822 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgt8s\" (UniqueName: \"kubernetes.io/projected/6b38979b-e35d-4fa3-a515-1e91fb6bf310-kube-api-access-lgt8s\") pod \"collect-profiles-29498820-982tq\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.398945 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.399218 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-log" containerID="cri-o://20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e" gracePeriod=30 Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.399681 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-api" containerID="cri-o://7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a" gracePeriod=30 Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.421040 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.427318 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b38979b-e35d-4fa3-a515-1e91fb6bf310-config-volume\") pod \"collect-profiles-29498820-982tq\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.427353 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b38979b-e35d-4fa3-a515-1e91fb6bf310-secret-volume\") pod \"collect-profiles-29498820-982tq\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.427376 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgt8s\" (UniqueName: \"kubernetes.io/projected/6b38979b-e35d-4fa3-a515-1e91fb6bf310-kube-api-access-lgt8s\") pod \"collect-profiles-29498820-982tq\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.428670 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b38979b-e35d-4fa3-a515-1e91fb6bf310-config-volume\") pod \"collect-profiles-29498820-982tq\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.431145 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.431448 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerName="nova-metadata-log" containerID="cri-o://0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6" gracePeriod=30 Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.431821 4546 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerName="nova-metadata-metadata" containerID="cri-o://39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48" gracePeriod=30 Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.442400 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b38979b-e35d-4fa3-a515-1e91fb6bf310-secret-volume\") pod \"collect-profiles-29498820-982tq\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.447411 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgt8s\" (UniqueName: \"kubernetes.io/projected/6b38979b-e35d-4fa3-a515-1e91fb6bf310-kube-api-access-lgt8s\") pod \"collect-profiles-29498820-982tq\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:00 crc kubenswrapper[4546]: I0201 07:00:00.594669 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.074204 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.152158 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h88jk\" (UniqueName: \"kubernetes.io/projected/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-kube-api-access-h88jk\") pod \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.152247 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-nova-metadata-tls-certs\") pod \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.152323 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-combined-ca-bundle\") pod \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.152370 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-logs\") pod \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.152413 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-config-data\") pod \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\" (UID: \"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08\") " Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.168183 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-kube-api-access-h88jk" (OuterVolumeSpecName: "kube-api-access-h88jk") pod "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" (UID: "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08"). InnerVolumeSpecName "kube-api-access-h88jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.168940 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-logs" (OuterVolumeSpecName: "logs") pod "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" (UID: "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.179538 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" (UID: "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.195928 4546 generic.go:334] "Generic (PLEG): container finished" podID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerID="20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e" exitCode=143 Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.196105 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"efd7b7ae-e4ee-45fd-865b-732ec58a4c69","Type":"ContainerDied","Data":"20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e"} Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.198990 4546 generic.go:334] "Generic (PLEG): container finished" podID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerID="39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48" exitCode=0 Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.199027 4546 generic.go:334] "Generic (PLEG): container finished" podID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerID="0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6" exitCode=143 Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.199085 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08","Type":"ContainerDied","Data":"39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48"} Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.199117 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08","Type":"ContainerDied","Data":"0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6"} Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.199134 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08","Type":"ContainerDied","Data":"c61d6aac1110bb30a6e6744137ccf7951dc19ae08fb52db80b8a95d4abd36d74"} Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.199155 4546 scope.go:117] "RemoveContainer" containerID="39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.199333 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.204460 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6a28f2e9-b910-48ce-a0d7-d97b27478c9a" containerName="nova-scheduler-scheduler" containerID="cri-o://e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89" gracePeriod=30 Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.217233 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" (UID: "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.255407 4546 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.255457 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.255558 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.256181 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h88jk\" (UniqueName: \"kubernetes.io/projected/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-kube-api-access-h88jk\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.286465 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-config-data" (OuterVolumeSpecName: "config-data") pod "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" (UID: "bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.309211 4546 scope.go:117] "RemoveContainer" containerID="0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.327991 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq"] Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.352778 4546 scope.go:117] "RemoveContainer" containerID="39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48" Feb 01 07:00:01 crc kubenswrapper[4546]: E0201 07:00:01.353212 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48\": container with ID starting with 39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48 not found: ID does not exist" containerID="39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.353265 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48"} err="failed to get container status \"39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48\": rpc error: code = NotFound desc = could not find container \"39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48\": container with ID starting with 39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48 not found: ID does not exist" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.353287 4546 scope.go:117] "RemoveContainer" containerID="0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6" Feb 01 07:00:01 crc kubenswrapper[4546]: E0201 07:00:01.353563 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6\": container with ID starting with 0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6 not found: ID does not exist" containerID="0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.353602 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6"} err="failed to get container status \"0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6\": rpc error: code = NotFound desc = could not find container \"0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6\": container with ID starting with 0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6 not found: ID does not exist" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.353619 4546 scope.go:117] "RemoveContainer" containerID="39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.353850 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48"} err="failed to get container status \"39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48\": rpc error: code = NotFound desc = could not find container \"39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48\": container with ID starting with 39fc64be47f52b76c0ec530534f93aeeed669a7f61e4312550fd19542f8e6e48 not found: ID does not exist" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.353932 4546 scope.go:117] "RemoveContainer" containerID="0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.354125 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6"} err="failed to get container status \"0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6\": rpc error: code = NotFound desc = could not find container \"0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6\": container with ID starting with 0819f5c0134728702e61849e4f8ce931fafde1b696752dd6465857c951fa00d6 not found: ID does not exist" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.358764 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.530728 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.547152 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.591491 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:01 crc kubenswrapper[4546]: E0201 07:00:01.591936 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerName="nova-metadata-log" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.591954 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerName="nova-metadata-log" Feb 01 07:00:01 crc kubenswrapper[4546]: E0201 07:00:01.591967 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerName="nova-metadata-metadata" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.591975 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerName="nova-metadata-metadata" Feb 01 07:00:01 
crc kubenswrapper[4546]: I0201 07:00:01.592129 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerName="nova-metadata-metadata" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.592153 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" containerName="nova-metadata-log" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.595669 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.597320 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.599616 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.634311 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.666084 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtr7w\" (UniqueName: \"kubernetes.io/projected/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-kube-api-access-xtr7w\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.666132 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-config-data\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.666177 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-logs\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.666194 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.666258 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.673134 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08" path="/var/lib/kubelet/pods/bd0dab2e-00ef-4ea0-8f49-cf39d16d0d08/volumes" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.673679 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" path="/var/lib/kubelet/pods/ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1/volumes" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.768605 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.769005 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xtr7w\" (UniqueName: \"kubernetes.io/projected/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-kube-api-access-xtr7w\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.769049 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-config-data\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.769100 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-logs\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.769116 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.769406 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-logs\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.774102 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-config-data\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 
07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.776218 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.776508 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.784810 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtr7w\" (UniqueName: \"kubernetes.io/projected/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-kube-api-access-xtr7w\") pod \"nova-metadata-0\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " pod="openstack/nova-metadata-0" Feb 01 07:00:01 crc kubenswrapper[4546]: I0201 07:00:01.914530 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:00:02 crc kubenswrapper[4546]: I0201 07:00:02.211914 4546 generic.go:334] "Generic (PLEG): container finished" podID="6b38979b-e35d-4fa3-a515-1e91fb6bf310" containerID="3923f276a66accbb6ef12ed7810189d9749e2378bbbc2af011b77abe62099391" exitCode=0 Feb 01 07:00:02 crc kubenswrapper[4546]: I0201 07:00:02.212123 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" event={"ID":"6b38979b-e35d-4fa3-a515-1e91fb6bf310","Type":"ContainerDied","Data":"3923f276a66accbb6ef12ed7810189d9749e2378bbbc2af011b77abe62099391"} Feb 01 07:00:02 crc kubenswrapper[4546]: I0201 07:00:02.212329 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" event={"ID":"6b38979b-e35d-4fa3-a515-1e91fb6bf310","Type":"ContainerStarted","Data":"6be41f3ae44579fde358565fcc64b3030f474627a63eaddec4dde07514279763"} Feb 01 07:00:02 crc kubenswrapper[4546]: I0201 07:00:02.381407 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:02 crc kubenswrapper[4546]: W0201 07:00:02.386159 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e47d60a_0b20_4ebb_8ac8_bfbd33e312af.slice/crio-cb1791c989372e568365a6d1b209be72e778e32b9f9ee5c804a72a612d236119 WatchSource:0}: Error finding container cb1791c989372e568365a6d1b209be72e778e32b9f9ee5c804a72a612d236119: Status 404 returned error can't find the container with id cb1791c989372e568365a6d1b209be72e778e32b9f9ee5c804a72a612d236119 Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.227997 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af","Type":"ContainerStarted","Data":"a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b"} Feb 
01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.228275 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af","Type":"ContainerStarted","Data":"b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c"} Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.228292 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af","Type":"ContainerStarted","Data":"cb1791c989372e568365a6d1b209be72e778e32b9f9ee5c804a72a612d236119"} Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.259193 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.259170505 podStartE2EDuration="2.259170505s" podCreationTimestamp="2026-02-01 07:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:03.250705056 +0000 UTC m=+1033.901641072" watchObservedRunningTime="2026-02-01 07:00:03.259170505 +0000 UTC m=+1033.910106520" Feb 01 07:00:03 crc kubenswrapper[4546]: E0201 07:00:03.357207 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:00:03 crc kubenswrapper[4546]: E0201 07:00:03.359445 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:00:03 crc kubenswrapper[4546]: E0201 
07:00:03.360669 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:00:03 crc kubenswrapper[4546]: E0201 07:00:03.360756 4546 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6a28f2e9-b910-48ce-a0d7-d97b27478c9a" containerName="nova-scheduler-scheduler" Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.537899 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.618814 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b38979b-e35d-4fa3-a515-1e91fb6bf310-config-volume\") pod \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.618902 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgt8s\" (UniqueName: \"kubernetes.io/projected/6b38979b-e35d-4fa3-a515-1e91fb6bf310-kube-api-access-lgt8s\") pod \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\" (UID: \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.619086 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b38979b-e35d-4fa3-a515-1e91fb6bf310-secret-volume\") pod \"6b38979b-e35d-4fa3-a515-1e91fb6bf310\" (UID: 
\"6b38979b-e35d-4fa3-a515-1e91fb6bf310\") " Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.619962 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b38979b-e35d-4fa3-a515-1e91fb6bf310-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b38979b-e35d-4fa3-a515-1e91fb6bf310" (UID: "6b38979b-e35d-4fa3-a515-1e91fb6bf310"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.620169 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b38979b-e35d-4fa3-a515-1e91fb6bf310-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.624168 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b38979b-e35d-4fa3-a515-1e91fb6bf310-kube-api-access-lgt8s" (OuterVolumeSpecName: "kube-api-access-lgt8s") pod "6b38979b-e35d-4fa3-a515-1e91fb6bf310" (UID: "6b38979b-e35d-4fa3-a515-1e91fb6bf310"). InnerVolumeSpecName "kube-api-access-lgt8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.624207 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b38979b-e35d-4fa3-a515-1e91fb6bf310-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b38979b-e35d-4fa3-a515-1e91fb6bf310" (UID: "6b38979b-e35d-4fa3-a515-1e91fb6bf310"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.722455 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b38979b-e35d-4fa3-a515-1e91fb6bf310-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:03 crc kubenswrapper[4546]: I0201 07:00:03.722496 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgt8s\" (UniqueName: \"kubernetes.io/projected/6b38979b-e35d-4fa3-a515-1e91fb6bf310-kube-api-access-lgt8s\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:04 crc kubenswrapper[4546]: I0201 07:00:04.174111 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b84f76f59-qvk5p" podUID="ed323bc5-e1f8-4472-87c9-cfa65bbdcbe1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.177:5353: i/o timeout" Feb 01 07:00:04 crc kubenswrapper[4546]: I0201 07:00:04.251788 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" Feb 01 07:00:04 crc kubenswrapper[4546]: I0201 07:00:04.252666 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq" event={"ID":"6b38979b-e35d-4fa3-a515-1e91fb6bf310","Type":"ContainerDied","Data":"6be41f3ae44579fde358565fcc64b3030f474627a63eaddec4dde07514279763"} Feb 01 07:00:04 crc kubenswrapper[4546]: I0201 07:00:04.252711 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6be41f3ae44579fde358565fcc64b3030f474627a63eaddec4dde07514279763" Feb 01 07:00:05 crc kubenswrapper[4546]: I0201 07:00:05.912523 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:00:05 crc kubenswrapper[4546]: I0201 07:00:05.971239 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2d75\" (UniqueName: \"kubernetes.io/projected/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-kube-api-access-s2d75\") pod \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " Feb 01 07:00:05 crc kubenswrapper[4546]: I0201 07:00:05.972172 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-combined-ca-bundle\") pod \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " Feb 01 07:00:05 crc kubenswrapper[4546]: I0201 07:00:05.972197 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-config-data\") pod \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\" (UID: \"6a28f2e9-b910-48ce-a0d7-d97b27478c9a\") " Feb 01 07:00:05 crc kubenswrapper[4546]: I0201 07:00:05.978669 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-kube-api-access-s2d75" (OuterVolumeSpecName: "kube-api-access-s2d75") pod "6a28f2e9-b910-48ce-a0d7-d97b27478c9a" (UID: "6a28f2e9-b910-48ce-a0d7-d97b27478c9a"). InnerVolumeSpecName "kube-api-access-s2d75". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.014682 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-config-data" (OuterVolumeSpecName: "config-data") pod "6a28f2e9-b910-48ce-a0d7-d97b27478c9a" (UID: "6a28f2e9-b910-48ce-a0d7-d97b27478c9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.029297 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a28f2e9-b910-48ce-a0d7-d97b27478c9a" (UID: "6a28f2e9-b910-48ce-a0d7-d97b27478c9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.076073 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2d75\" (UniqueName: \"kubernetes.io/projected/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-kube-api-access-s2d75\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.076330 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.076343 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a28f2e9-b910-48ce-a0d7-d97b27478c9a-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.099377 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.177953 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn49g\" (UniqueName: \"kubernetes.io/projected/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-kube-api-access-nn49g\") pod \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.178295 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-logs\") pod \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.178534 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-combined-ca-bundle\") pod \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.178586 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-config-data\") pod \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\" (UID: \"efd7b7ae-e4ee-45fd-865b-732ec58a4c69\") " Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.179017 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-logs" (OuterVolumeSpecName: "logs") pod "efd7b7ae-e4ee-45fd-865b-732ec58a4c69" (UID: "efd7b7ae-e4ee-45fd-865b-732ec58a4c69"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.179707 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.183027 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-kube-api-access-nn49g" (OuterVolumeSpecName: "kube-api-access-nn49g") pod "efd7b7ae-e4ee-45fd-865b-732ec58a4c69" (UID: "efd7b7ae-e4ee-45fd-865b-732ec58a4c69"). InnerVolumeSpecName "kube-api-access-nn49g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.204254 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-config-data" (OuterVolumeSpecName: "config-data") pod "efd7b7ae-e4ee-45fd-865b-732ec58a4c69" (UID: "efd7b7ae-e4ee-45fd-865b-732ec58a4c69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.206806 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efd7b7ae-e4ee-45fd-865b-732ec58a4c69" (UID: "efd7b7ae-e4ee-45fd-865b-732ec58a4c69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.277777 4546 generic.go:334] "Generic (PLEG): container finished" podID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerID="7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a" exitCode=0 Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.277938 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"efd7b7ae-e4ee-45fd-865b-732ec58a4c69","Type":"ContainerDied","Data":"7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a"} Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.278023 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"efd7b7ae-e4ee-45fd-865b-732ec58a4c69","Type":"ContainerDied","Data":"da1b0cb577d0638904c398d2c35ca97f700c68f2996f68a5cbf6fbf494af21f0"} Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.278070 4546 scope.go:117] "RemoveContainer" containerID="7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.278314 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.281043 4546 generic.go:334] "Generic (PLEG): container finished" podID="6a28f2e9-b910-48ce-a0d7-d97b27478c9a" containerID="e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89" exitCode=0 Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.281075 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a28f2e9-b910-48ce-a0d7-d97b27478c9a","Type":"ContainerDied","Data":"e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89"} Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.281190 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a28f2e9-b910-48ce-a0d7-d97b27478c9a","Type":"ContainerDied","Data":"06b3432cab63c943348bf0d7b502bc9fc93b58d0d838bd92a8b992e5a8c1fec9"} Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.281249 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.281754 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn49g\" (UniqueName: \"kubernetes.io/projected/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-kube-api-access-nn49g\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.282168 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.284832 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd7b7ae-e4ee-45fd-865b-732ec58a4c69-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.311489 4546 scope.go:117] "RemoveContainer" containerID="20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.323353 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.330948 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.337327 4546 scope.go:117] "RemoveContainer" containerID="7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.337427 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:06 crc kubenswrapper[4546]: E0201 07:00:06.338330 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a\": container with ID starting with 
7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a not found: ID does not exist" containerID="7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.338365 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a"} err="failed to get container status \"7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a\": rpc error: code = NotFound desc = could not find container \"7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a\": container with ID starting with 7ef11c80f8e7ee6e1e6743e54b0979dfbc573dbc0f971a699ffd99000638893a not found: ID does not exist" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.338389 4546 scope.go:117] "RemoveContainer" containerID="20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e" Feb 01 07:00:06 crc kubenswrapper[4546]: E0201 07:00:06.338813 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e\": container with ID starting with 20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e not found: ID does not exist" containerID="20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.338849 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e"} err="failed to get container status \"20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e\": rpc error: code = NotFound desc = could not find container \"20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e\": container with ID starting with 20ad39e6a7891f9f53650320b735c4c771013044313295359c2575f6b89dfa0e not found: ID does not 
exist" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.338915 4546 scope.go:117] "RemoveContainer" containerID="e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.341434 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.346732 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:06 crc kubenswrapper[4546]: E0201 07:00:06.347144 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-api" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.347209 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-api" Feb 01 07:00:06 crc kubenswrapper[4546]: E0201 07:00:06.347239 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b38979b-e35d-4fa3-a515-1e91fb6bf310" containerName="collect-profiles" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.347245 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b38979b-e35d-4fa3-a515-1e91fb6bf310" containerName="collect-profiles" Feb 01 07:00:06 crc kubenswrapper[4546]: E0201 07:00:06.347267 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-log" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.347273 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-log" Feb 01 07:00:06 crc kubenswrapper[4546]: E0201 07:00:06.347290 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a28f2e9-b910-48ce-a0d7-d97b27478c9a" containerName="nova-scheduler-scheduler" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.347298 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6a28f2e9-b910-48ce-a0d7-d97b27478c9a" containerName="nova-scheduler-scheduler" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.347465 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-api" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.347486 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a28f2e9-b910-48ce-a0d7-d97b27478c9a" containerName="nova-scheduler-scheduler" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.347501 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" containerName="nova-api-log" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.347509 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b38979b-e35d-4fa3-a515-1e91fb6bf310" containerName="collect-profiles" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.348097 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.351921 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.370635 4546 scope.go:117] "RemoveContainer" containerID="e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89" Feb 01 07:00:06 crc kubenswrapper[4546]: E0201 07:00:06.372581 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89\": container with ID starting with e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89 not found: ID does not exist" containerID="e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.372614 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89"} err="failed to get container status \"e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89\": rpc error: code = NotFound desc = could not find container \"e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89\": container with ID starting with e44f33f2efd0d12063ed1df8124b0bb78a98c7653336002d4105ffe737ba7d89 not found: ID does not exist" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.380022 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.381958 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.385214 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.386783 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-config-data\") pod \"nova-scheduler-0\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.387110 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbxfs\" (UniqueName: \"kubernetes.io/projected/0c54e3bc-bf1b-4a51-946d-be6858436839-kube-api-access-xbxfs\") pod \"nova-scheduler-0\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.387299 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.403100 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.421341 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.489874 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-config-data\") pod \"nova-scheduler-0\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " 
pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.490050 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd562a1-1119-4464-8887-bc606ef8cef4-logs\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.490142 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trftr\" (UniqueName: \"kubernetes.io/projected/acd562a1-1119-4464-8887-bc606ef8cef4-kube-api-access-trftr\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.490248 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbxfs\" (UniqueName: \"kubernetes.io/projected/0c54e3bc-bf1b-4a51-946d-be6858436839-kube-api-access-xbxfs\") pod \"nova-scheduler-0\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.490290 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.490369 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-config-data\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.490448 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.495900 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.497634 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-config-data\") pod \"nova-scheduler-0\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.530639 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbxfs\" (UniqueName: \"kubernetes.io/projected/0c54e3bc-bf1b-4a51-946d-be6858436839-kube-api-access-xbxfs\") pod \"nova-scheduler-0\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.593223 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd562a1-1119-4464-8887-bc606ef8cef4-logs\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.593304 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trftr\" (UniqueName: \"kubernetes.io/projected/acd562a1-1119-4464-8887-bc606ef8cef4-kube-api-access-trftr\") pod \"nova-api-0\" (UID: 
\"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.593375 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.593442 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-config-data\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.595429 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd562a1-1119-4464-8887-bc606ef8cef4-logs\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.597979 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.600634 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-config-data\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.615314 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trftr\" (UniqueName: 
\"kubernetes.io/projected/acd562a1-1119-4464-8887-bc606ef8cef4-kube-api-access-trftr\") pod \"nova-api-0\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.669720 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.700436 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.915418 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 07:00:06 crc kubenswrapper[4546]: I0201 07:00:06.915877 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 07:00:07 crc kubenswrapper[4546]: I0201 07:00:07.145492 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:07 crc kubenswrapper[4546]: W0201 07:00:07.150046 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c54e3bc_bf1b_4a51_946d_be6858436839.slice/crio-cc5aeedece70386c63982a832186a55b8d142769bfd7bcfa20fa39088e89618c WatchSource:0}: Error finding container cc5aeedece70386c63982a832186a55b8d142769bfd7bcfa20fa39088e89618c: Status 404 returned error can't find the container with id cc5aeedece70386c63982a832186a55b8d142769bfd7bcfa20fa39088e89618c Feb 01 07:00:07 crc kubenswrapper[4546]: I0201 07:00:07.198045 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:07 crc kubenswrapper[4546]: W0201 07:00:07.200203 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacd562a1_1119_4464_8887_bc606ef8cef4.slice/crio-ec3e032a8148c477075edfcc958007ac0c328ca38481ba96eeb0d6f146bb300e 
WatchSource:0}: Error finding container ec3e032a8148c477075edfcc958007ac0c328ca38481ba96eeb0d6f146bb300e: Status 404 returned error can't find the container with id ec3e032a8148c477075edfcc958007ac0c328ca38481ba96eeb0d6f146bb300e Feb 01 07:00:07 crc kubenswrapper[4546]: I0201 07:00:07.290246 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acd562a1-1119-4464-8887-bc606ef8cef4","Type":"ContainerStarted","Data":"ec3e032a8148c477075edfcc958007ac0c328ca38481ba96eeb0d6f146bb300e"} Feb 01 07:00:07 crc kubenswrapper[4546]: I0201 07:00:07.296884 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c54e3bc-bf1b-4a51-946d-be6858436839","Type":"ContainerStarted","Data":"cc5aeedece70386c63982a832186a55b8d142769bfd7bcfa20fa39088e89618c"} Feb 01 07:00:07 crc kubenswrapper[4546]: I0201 07:00:07.318635 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.318617819 podStartE2EDuration="1.318617819s" podCreationTimestamp="2026-02-01 07:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:07.315215897 +0000 UTC m=+1037.966151913" watchObservedRunningTime="2026-02-01 07:00:07.318617819 +0000 UTC m=+1037.969553835" Feb 01 07:00:07 crc kubenswrapper[4546]: I0201 07:00:07.668800 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a28f2e9-b910-48ce-a0d7-d97b27478c9a" path="/var/lib/kubelet/pods/6a28f2e9-b910-48ce-a0d7-d97b27478c9a/volumes" Feb 01 07:00:07 crc kubenswrapper[4546]: I0201 07:00:07.669422 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd7b7ae-e4ee-45fd-865b-732ec58a4c69" path="/var/lib/kubelet/pods/efd7b7ae-e4ee-45fd-865b-732ec58a4c69/volumes" Feb 01 07:00:08 crc kubenswrapper[4546]: I0201 07:00:08.312727 4546 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c54e3bc-bf1b-4a51-946d-be6858436839","Type":"ContainerStarted","Data":"6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971"} Feb 01 07:00:08 crc kubenswrapper[4546]: I0201 07:00:08.316692 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acd562a1-1119-4464-8887-bc606ef8cef4","Type":"ContainerStarted","Data":"5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648"} Feb 01 07:00:08 crc kubenswrapper[4546]: I0201 07:00:08.316774 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acd562a1-1119-4464-8887-bc606ef8cef4","Type":"ContainerStarted","Data":"832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95"} Feb 01 07:00:08 crc kubenswrapper[4546]: I0201 07:00:08.344692 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.344673369 podStartE2EDuration="2.344673369s" podCreationTimestamp="2026-02-01 07:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:08.336324341 +0000 UTC m=+1038.987260357" watchObservedRunningTime="2026-02-01 07:00:08.344673369 +0000 UTC m=+1038.995609385" Feb 01 07:00:08 crc kubenswrapper[4546]: I0201 07:00:08.516693 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 01 07:00:11 crc kubenswrapper[4546]: I0201 07:00:11.670232 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 01 07:00:11 crc kubenswrapper[4546]: I0201 07:00:11.915007 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 07:00:11 crc kubenswrapper[4546]: I0201 07:00:11.915452 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Feb 01 07:00:12 crc kubenswrapper[4546]: I0201 07:00:12.320184 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 01 07:00:12 crc kubenswrapper[4546]: I0201 07:00:12.932605 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 07:00:12 crc kubenswrapper[4546]: I0201 07:00:12.932675 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 07:00:15 crc kubenswrapper[4546]: I0201 07:00:15.527526 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:00:15 crc kubenswrapper[4546]: I0201 07:00:15.528245 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="fa29cd22-5996-4415-92c9-8012caf2dcfb" containerName="kube-state-metrics" containerID="cri-o://1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc" gracePeriod=30 Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.143118 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.216059 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh46z\" (UniqueName: \"kubernetes.io/projected/fa29cd22-5996-4415-92c9-8012caf2dcfb-kube-api-access-lh46z\") pod \"fa29cd22-5996-4415-92c9-8012caf2dcfb\" (UID: \"fa29cd22-5996-4415-92c9-8012caf2dcfb\") " Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.223282 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa29cd22-5996-4415-92c9-8012caf2dcfb-kube-api-access-lh46z" (OuterVolumeSpecName: "kube-api-access-lh46z") pod "fa29cd22-5996-4415-92c9-8012caf2dcfb" (UID: "fa29cd22-5996-4415-92c9-8012caf2dcfb"). InnerVolumeSpecName "kube-api-access-lh46z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.318671 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh46z\" (UniqueName: \"kubernetes.io/projected/fa29cd22-5996-4415-92c9-8012caf2dcfb-kube-api-access-lh46z\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.444978 4546 generic.go:334] "Generic (PLEG): container finished" podID="fa29cd22-5996-4415-92c9-8012caf2dcfb" containerID="1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc" exitCode=2 Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.445043 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fa29cd22-5996-4415-92c9-8012caf2dcfb","Type":"ContainerDied","Data":"1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc"} Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.445083 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"fa29cd22-5996-4415-92c9-8012caf2dcfb","Type":"ContainerDied","Data":"848310a9358a98807ffec056ecfe6bd125059678ddaf03ea23ddebcd2e8470c7"} Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.445104 4546 scope.go:117] "RemoveContainer" containerID="1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.445306 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.479950 4546 scope.go:117] "RemoveContainer" containerID="1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc" Feb 01 07:00:16 crc kubenswrapper[4546]: E0201 07:00:16.480421 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc\": container with ID starting with 1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc not found: ID does not exist" containerID="1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.480523 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc"} err="failed to get container status \"1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc\": rpc error: code = NotFound desc = could not find container \"1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc\": container with ID starting with 1f84deb035183a71b65247013eb5f1e1e91f32a1f77d6b4edd90717dc5a88edc not found: ID does not exist" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.494632 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.510070 4546 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.517308 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:00:16 crc kubenswrapper[4546]: E0201 07:00:16.518544 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa29cd22-5996-4415-92c9-8012caf2dcfb" containerName="kube-state-metrics" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.518568 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa29cd22-5996-4415-92c9-8012caf2dcfb" containerName="kube-state-metrics" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.518788 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa29cd22-5996-4415-92c9-8012caf2dcfb" containerName="kube-state-metrics" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.519617 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.522285 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.526343 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.527324 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.625022 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f076ee1b-b564-435a-a66f-b061fbc6c8f3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.625390 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f076ee1b-b564-435a-a66f-b061fbc6c8f3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.625612 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qptvg\" (UniqueName: \"kubernetes.io/projected/f076ee1b-b564-435a-a66f-b061fbc6c8f3-kube-api-access-qptvg\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.625768 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f076ee1b-b564-435a-a66f-b061fbc6c8f3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.669902 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.695270 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.701242 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.701278 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.727380 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f076ee1b-b564-435a-a66f-b061fbc6c8f3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.727450 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qptvg\" (UniqueName: \"kubernetes.io/projected/f076ee1b-b564-435a-a66f-b061fbc6c8f3-kube-api-access-qptvg\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.727487 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f076ee1b-b564-435a-a66f-b061fbc6c8f3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.727576 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f076ee1b-b564-435a-a66f-b061fbc6c8f3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.733566 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f076ee1b-b564-435a-a66f-b061fbc6c8f3-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.735253 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f076ee1b-b564-435a-a66f-b061fbc6c8f3-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.735340 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f076ee1b-b564-435a-a66f-b061fbc6c8f3-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.742686 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qptvg\" (UniqueName: \"kubernetes.io/projected/f076ee1b-b564-435a-a66f-b061fbc6c8f3-kube-api-access-qptvg\") pod \"kube-state-metrics-0\" (UID: \"f076ee1b-b564-435a-a66f-b061fbc6c8f3\") " pod="openstack/kube-state-metrics-0" Feb 01 07:00:16 crc kubenswrapper[4546]: I0201 07:00:16.842706 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:00:17 crc kubenswrapper[4546]: W0201 07:00:17.285372 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf076ee1b_b564_435a_a66f_b061fbc6c8f3.slice/crio-de9f2bd7ae29986c1d55d44dc8e7a0ea988ac5b0ac7c501c926fa1fe41183a47 WatchSource:0}: Error finding container de9f2bd7ae29986c1d55d44dc8e7a0ea988ac5b0ac7c501c926fa1fe41183a47: Status 404 returned error can't find the container with id de9f2bd7ae29986c1d55d44dc8e7a0ea988ac5b0ac7c501c926fa1fe41183a47 Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.294346 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.390104 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.390410 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="ceilometer-central-agent" containerID="cri-o://fefa4df7d106b61293016f12588d6abd82428ce3e1b6fcab0480e7103410c527" gracePeriod=30 Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.390469 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="sg-core" containerID="cri-o://e25dec80ca4d32fdfd281d21bd27a1e13d4c3fde6ce743d6799812758b279e79" gracePeriod=30 Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.390489 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="proxy-httpd" containerID="cri-o://bc86699e7d9725585b5f3f4777d648363be52b2a67d6ef9144d0e9e168eb67fe" gracePeriod=30 Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 
07:00:17.390585 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="ceilometer-notification-agent" containerID="cri-o://c80b0bfc435be5a41c60880558336e3317d85bdb64376dc5b6efb4ac623bf962" gracePeriod=30 Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.471997 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f076ee1b-b564-435a-a66f-b061fbc6c8f3","Type":"ContainerStarted","Data":"de9f2bd7ae29986c1d55d44dc8e7a0ea988ac5b0ac7c501c926fa1fe41183a47"} Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.523885 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.663473 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa29cd22-5996-4415-92c9-8012caf2dcfb" path="/var/lib/kubelet/pods/fa29cd22-5996-4415-92c9-8012caf2dcfb/volumes" Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.786019 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 07:00:17 crc kubenswrapper[4546]: I0201 07:00:17.786805 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 07:00:18 crc kubenswrapper[4546]: I0201 07:00:18.487528 4546 generic.go:334] "Generic (PLEG): container finished" podID="76aad09c-7fb0-499f-afca-a553aab90ad1" 
containerID="bc86699e7d9725585b5f3f4777d648363be52b2a67d6ef9144d0e9e168eb67fe" exitCode=0 Feb 01 07:00:18 crc kubenswrapper[4546]: I0201 07:00:18.487959 4546 generic.go:334] "Generic (PLEG): container finished" podID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerID="e25dec80ca4d32fdfd281d21bd27a1e13d4c3fde6ce743d6799812758b279e79" exitCode=2 Feb 01 07:00:18 crc kubenswrapper[4546]: I0201 07:00:18.487970 4546 generic.go:334] "Generic (PLEG): container finished" podID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerID="fefa4df7d106b61293016f12588d6abd82428ce3e1b6fcab0480e7103410c527" exitCode=0 Feb 01 07:00:18 crc kubenswrapper[4546]: I0201 07:00:18.487621 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerDied","Data":"bc86699e7d9725585b5f3f4777d648363be52b2a67d6ef9144d0e9e168eb67fe"} Feb 01 07:00:18 crc kubenswrapper[4546]: I0201 07:00:18.488099 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerDied","Data":"e25dec80ca4d32fdfd281d21bd27a1e13d4c3fde6ce743d6799812758b279e79"} Feb 01 07:00:18 crc kubenswrapper[4546]: I0201 07:00:18.488127 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerDied","Data":"fefa4df7d106b61293016f12588d6abd82428ce3e1b6fcab0480e7103410c527"} Feb 01 07:00:18 crc kubenswrapper[4546]: I0201 07:00:18.489995 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f076ee1b-b564-435a-a66f-b061fbc6c8f3","Type":"ContainerStarted","Data":"0e654d2c82833ee2efda65df641216624202bb01f733ecaa48ac03fbcf2d6fc4"} Feb 01 07:00:18 crc kubenswrapper[4546]: I0201 07:00:18.507726 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" 
podStartSLOduration=2.203020499 podStartE2EDuration="2.50770097s" podCreationTimestamp="2026-02-01 07:00:16 +0000 UTC" firstStartedPulling="2026-02-01 07:00:17.288100229 +0000 UTC m=+1047.939036245" lastFinishedPulling="2026-02-01 07:00:17.5927807 +0000 UTC m=+1048.243716716" observedRunningTime="2026-02-01 07:00:18.502821341 +0000 UTC m=+1049.153757357" watchObservedRunningTime="2026-02-01 07:00:18.50770097 +0000 UTC m=+1049.158636986" Feb 01 07:00:19 crc kubenswrapper[4546]: I0201 07:00:19.499024 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 01 07:00:21 crc kubenswrapper[4546]: I0201 07:00:21.921112 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 01 07:00:21 crc kubenswrapper[4546]: I0201 07:00:21.921593 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 01 07:00:21 crc kubenswrapper[4546]: I0201 07:00:21.929470 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 01 07:00:21 crc kubenswrapper[4546]: I0201 07:00:21.929922 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.532235 4546 generic.go:334] "Generic (PLEG): container finished" podID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerID="c80b0bfc435be5a41c60880558336e3317d85bdb64376dc5b6efb4ac623bf962" exitCode=0 Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.533080 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerDied","Data":"c80b0bfc435be5a41c60880558336e3317d85bdb64376dc5b6efb4ac623bf962"} Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.533210 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"76aad09c-7fb0-499f-afca-a553aab90ad1","Type":"ContainerDied","Data":"7741c209670df836844af586c868b8381d18f1f7f022489dad276807ad9a9557"} Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.533251 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7741c209670df836844af586c868b8381d18f1f7f022489dad276807ad9a9557" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.579599 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.669843 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-run-httpd\") pod \"76aad09c-7fb0-499f-afca-a553aab90ad1\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.669969 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-log-httpd\") pod \"76aad09c-7fb0-499f-afca-a553aab90ad1\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.670028 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7nh7\" (UniqueName: \"kubernetes.io/projected/76aad09c-7fb0-499f-afca-a553aab90ad1-kube-api-access-j7nh7\") pod \"76aad09c-7fb0-499f-afca-a553aab90ad1\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.670244 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "76aad09c-7fb0-499f-afca-a553aab90ad1" (UID: "76aad09c-7fb0-499f-afca-a553aab90ad1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.670589 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "76aad09c-7fb0-499f-afca-a553aab90ad1" (UID: "76aad09c-7fb0-499f-afca-a553aab90ad1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.671098 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-scripts\") pod \"76aad09c-7fb0-499f-afca-a553aab90ad1\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.671200 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-sg-core-conf-yaml\") pod \"76aad09c-7fb0-499f-afca-a553aab90ad1\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.671261 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-combined-ca-bundle\") pod \"76aad09c-7fb0-499f-afca-a553aab90ad1\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.671292 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-config-data\") pod \"76aad09c-7fb0-499f-afca-a553aab90ad1\" (UID: \"76aad09c-7fb0-499f-afca-a553aab90ad1\") " Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.671942 4546 reconciler_common.go:293] "Volume detached for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.671963 4546 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76aad09c-7fb0-499f-afca-a553aab90ad1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.682620 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76aad09c-7fb0-499f-afca-a553aab90ad1-kube-api-access-j7nh7" (OuterVolumeSpecName: "kube-api-access-j7nh7") pod "76aad09c-7fb0-499f-afca-a553aab90ad1" (UID: "76aad09c-7fb0-499f-afca-a553aab90ad1"). InnerVolumeSpecName "kube-api-access-j7nh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.686116 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-scripts" (OuterVolumeSpecName: "scripts") pod "76aad09c-7fb0-499f-afca-a553aab90ad1" (UID: "76aad09c-7fb0-499f-afca-a553aab90ad1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.738622 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "76aad09c-7fb0-499f-afca-a553aab90ad1" (UID: "76aad09c-7fb0-499f-afca-a553aab90ad1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.751390 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76aad09c-7fb0-499f-afca-a553aab90ad1" (UID: "76aad09c-7fb0-499f-afca-a553aab90ad1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.770348 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-config-data" (OuterVolumeSpecName: "config-data") pod "76aad09c-7fb0-499f-afca-a553aab90ad1" (UID: "76aad09c-7fb0-499f-afca-a553aab90ad1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.773105 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.773136 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.773148 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7nh7\" (UniqueName: \"kubernetes.io/projected/76aad09c-7fb0-499f-afca-a553aab90ad1-kube-api-access-j7nh7\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.773160 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 
07:00:22 crc kubenswrapper[4546]: I0201 07:00:22.773171 4546 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76aad09c-7fb0-499f-afca-a553aab90ad1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.543025 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.584051 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.597971 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.613937 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:23 crc kubenswrapper[4546]: E0201 07:00:23.614794 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="proxy-httpd" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.614896 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="proxy-httpd" Feb 01 07:00:23 crc kubenswrapper[4546]: E0201 07:00:23.614964 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="sg-core" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.615017 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="sg-core" Feb 01 07:00:23 crc kubenswrapper[4546]: E0201 07:00:23.615116 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="ceilometer-central-agent" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.615174 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="ceilometer-central-agent" Feb 01 07:00:23 crc kubenswrapper[4546]: E0201 07:00:23.615223 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="ceilometer-notification-agent" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.615263 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="ceilometer-notification-agent" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.615598 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="ceilometer-central-agent" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.615660 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="proxy-httpd" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.615701 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="sg-core" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.615779 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" containerName="ceilometer-notification-agent" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.618413 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.624518 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.624882 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.625026 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.644689 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.668648 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76aad09c-7fb0-499f-afca-a553aab90ad1" path="/var/lib/kubelet/pods/76aad09c-7fb0-499f-afca-a553aab90ad1/volumes" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.689819 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-run-httpd\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.689874 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.689946 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-config-data\") pod \"ceilometer-0\" (UID: 
\"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.689998 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.690039 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-scripts\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.690143 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-log-httpd\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.690187 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.690238 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcmkq\" (UniqueName: \"kubernetes.io/projected/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-kube-api-access-gcmkq\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 
07:00:23.792584 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.793094 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcmkq\" (UniqueName: \"kubernetes.io/projected/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-kube-api-access-gcmkq\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.793149 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-run-httpd\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.793175 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.793247 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-config-data\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.793658 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-run-httpd\") pod 
\"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.793380 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.794024 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-scripts\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.794277 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-log-httpd\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.794696 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-log-httpd\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.798801 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.798844 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.799037 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-scripts\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.799450 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-config-data\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.810749 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.811034 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcmkq\" (UniqueName: \"kubernetes.io/projected/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-kube-api-access-gcmkq\") pod \"ceilometer-0\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " pod="openstack/ceilometer-0" Feb 01 07:00:23 crc kubenswrapper[4546]: I0201 07:00:23.938118 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:00:24 crc kubenswrapper[4546]: W0201 07:00:24.222566 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbab8a7a_f3fa_4933_92ef_6d13c2d03b06.slice/crio-4c4acda9d3da5f7f8a6c6d20173b5d38f02d476d30d48dc53efae3cb79cd1332 WatchSource:0}: Error finding container 4c4acda9d3da5f7f8a6c6d20173b5d38f02d476d30d48dc53efae3cb79cd1332: Status 404 returned error can't find the container with id 4c4acda9d3da5f7f8a6c6d20173b5d38f02d476d30d48dc53efae3cb79cd1332 Feb 01 07:00:24 crc kubenswrapper[4546]: I0201 07:00:24.224631 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:24 crc kubenswrapper[4546]: I0201 07:00:24.225626 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:00:24 crc kubenswrapper[4546]: I0201 07:00:24.555354 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerStarted","Data":"4c4acda9d3da5f7f8a6c6d20173b5d38f02d476d30d48dc53efae3cb79cd1332"} Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.457582 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.534422 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-combined-ca-bundle\") pod \"e44d36d3-89ac-4873-8913-a3a0c6faa798\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.534477 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-989d6\" (UniqueName: \"kubernetes.io/projected/e44d36d3-89ac-4873-8913-a3a0c6faa798-kube-api-access-989d6\") pod \"e44d36d3-89ac-4873-8913-a3a0c6faa798\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.534578 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-config-data\") pod \"e44d36d3-89ac-4873-8913-a3a0c6faa798\" (UID: \"e44d36d3-89ac-4873-8913-a3a0c6faa798\") " Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.541662 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e44d36d3-89ac-4873-8913-a3a0c6faa798-kube-api-access-989d6" (OuterVolumeSpecName: "kube-api-access-989d6") pod "e44d36d3-89ac-4873-8913-a3a0c6faa798" (UID: "e44d36d3-89ac-4873-8913-a3a0c6faa798"). InnerVolumeSpecName "kube-api-access-989d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.570462 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e44d36d3-89ac-4873-8913-a3a0c6faa798" (UID: "e44d36d3-89ac-4873-8913-a3a0c6faa798"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.572591 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerStarted","Data":"ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61"} Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.574295 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-config-data" (OuterVolumeSpecName: "config-data") pod "e44d36d3-89ac-4873-8913-a3a0c6faa798" (UID: "e44d36d3-89ac-4873-8913-a3a0c6faa798"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.575270 4546 generic.go:334] "Generic (PLEG): container finished" podID="e44d36d3-89ac-4873-8913-a3a0c6faa798" containerID="40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5" exitCode=137 Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.575344 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e44d36d3-89ac-4873-8913-a3a0c6faa798","Type":"ContainerDied","Data":"40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5"} Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.575391 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e44d36d3-89ac-4873-8913-a3a0c6faa798","Type":"ContainerDied","Data":"b30020868fc179ed4e1413a86e04333ce02608f076ed004ac2d01ae39d2e32d2"} Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.575418 4546 scope.go:117] "RemoveContainer" containerID="40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.575474 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.644908 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.645178 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-989d6\" (UniqueName: \"kubernetes.io/projected/e44d36d3-89ac-4873-8913-a3a0c6faa798-kube-api-access-989d6\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.645234 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44d36d3-89ac-4873-8913-a3a0c6faa798-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.645608 4546 scope.go:117] "RemoveContainer" containerID="40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5" Feb 01 07:00:25 crc kubenswrapper[4546]: E0201 07:00:25.646282 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5\": container with ID starting with 40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5 not found: ID does not exist" containerID="40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.646330 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5"} err="failed to get container status \"40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5\": rpc error: code = NotFound desc = could not find container \"40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5\": container 
with ID starting with 40a50fc4908a3d2f7fe0e9adf1e27b7ce1d14bcce6970f1523b48a5422082bc5 not found: ID does not exist" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.653220 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.667631 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.679325 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:00:25 crc kubenswrapper[4546]: E0201 07:00:25.682642 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e44d36d3-89ac-4873-8913-a3a0c6faa798" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.682713 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e44d36d3-89ac-4873-8913-a3a0c6faa798" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.682960 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e44d36d3-89ac-4873-8913-a3a0c6faa798" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.684518 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.684663 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.698371 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.698542 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.698540 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.747884 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.748147 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql9kh\" (UniqueName: \"kubernetes.io/projected/9e02f56a-5cc0-4644-bdb8-7ec067852362-kube-api-access-ql9kh\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.748212 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.748279 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.748351 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.851513 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql9kh\" (UniqueName: \"kubernetes.io/projected/9e02f56a-5cc0-4644-bdb8-7ec067852362-kube-api-access-ql9kh\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.852030 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.852083 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.852122 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.852223 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.856961 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.858383 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.858900 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.862693 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e02f56a-5cc0-4644-bdb8-7ec067852362-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:25 crc kubenswrapper[4546]: I0201 07:00:25.868896 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql9kh\" (UniqueName: \"kubernetes.io/projected/9e02f56a-5cc0-4644-bdb8-7ec067852362-kube-api-access-ql9kh\") pod \"nova-cell1-novncproxy-0\" (UID: \"9e02f56a-5cc0-4644-bdb8-7ec067852362\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:26 crc kubenswrapper[4546]: I0201 07:00:26.019718 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:26 crc kubenswrapper[4546]: I0201 07:00:26.504690 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:00:26 crc kubenswrapper[4546]: I0201 07:00:26.586909 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9e02f56a-5cc0-4644-bdb8-7ec067852362","Type":"ContainerStarted","Data":"bfb44d1820566ef995595c06c5961191386316a742d0b73ac2473295cc3c3ce1"} Feb 01 07:00:26 crc kubenswrapper[4546]: I0201 07:00:26.589979 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerStarted","Data":"0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e"} Feb 01 07:00:26 crc kubenswrapper[4546]: I0201 07:00:26.712259 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 01 07:00:26 crc kubenswrapper[4546]: I0201 07:00:26.712901 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 01 07:00:26 crc kubenswrapper[4546]: I0201 07:00:26.713272 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 01 07:00:26 crc kubenswrapper[4546]: I0201 07:00:26.721690 4546 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 01 07:00:26 crc kubenswrapper[4546]: I0201 07:00:26.851557 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.606270 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9e02f56a-5cc0-4644-bdb8-7ec067852362","Type":"ContainerStarted","Data":"235f367178829b904fea660aeb3c357e4fb85393edb43d0fcf773f195daec7c5"} Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.610244 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerStarted","Data":"f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5"} Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.610606 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.620187 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.629372 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.629358039 podStartE2EDuration="2.629358039s" podCreationTimestamp="2026-02-01 07:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:27.623700605 +0000 UTC m=+1058.274636621" watchObservedRunningTime="2026-02-01 07:00:27.629358039 +0000 UTC m=+1058.280294046" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.688195 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e44d36d3-89ac-4873-8913-a3a0c6faa798" path="/var/lib/kubelet/pods/e44d36d3-89ac-4873-8913-a3a0c6faa798/volumes" 
Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.845338 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86c594d9d9-q65rt"] Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.850703 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.873575 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c594d9d9-q65rt"] Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.926251 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-config\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.926406 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-swift-storage-0\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.926625 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cgls\" (UniqueName: \"kubernetes.io/projected/f024d767-7d73-468c-a37b-4bab42ab32ba-kube-api-access-8cgls\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.926670 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-svc\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.926816 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-nb\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:27 crc kubenswrapper[4546]: I0201 07:00:27.926841 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-sb\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.029599 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-config\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.029757 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-swift-storage-0\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.030590 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-config\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.030700 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-swift-storage-0\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.030969 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cgls\" (UniqueName: \"kubernetes.io/projected/f024d767-7d73-468c-a37b-4bab42ab32ba-kube-api-access-8cgls\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.031022 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-svc\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.031152 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-nb\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.031178 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-sb\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.032102 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-sb\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.033054 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-svc\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.033591 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-nb\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.051205 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cgls\" (UniqueName: \"kubernetes.io/projected/f024d767-7d73-468c-a37b-4bab42ab32ba-kube-api-access-8cgls\") pod \"dnsmasq-dns-86c594d9d9-q65rt\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.188772 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:28 crc kubenswrapper[4546]: I0201 07:00:28.762938 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c594d9d9-q65rt"] Feb 01 07:00:28 crc kubenswrapper[4546]: W0201 07:00:28.770981 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf024d767_7d73_468c_a37b_4bab42ab32ba.slice/crio-320c1f53b6215d287144db7c4daa03a5477edecbd8f55f4aff9395284acf97db WatchSource:0}: Error finding container 320c1f53b6215d287144db7c4daa03a5477edecbd8f55f4aff9395284acf97db: Status 404 returned error can't find the container with id 320c1f53b6215d287144db7c4daa03a5477edecbd8f55f4aff9395284acf97db Feb 01 07:00:29 crc kubenswrapper[4546]: I0201 07:00:29.631599 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerStarted","Data":"eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea"} Feb 01 07:00:29 crc kubenswrapper[4546]: I0201 07:00:29.631963 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 07:00:29 crc kubenswrapper[4546]: I0201 07:00:29.633530 4546 generic.go:334] "Generic (PLEG): container finished" podID="f024d767-7d73-468c-a37b-4bab42ab32ba" containerID="9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed" exitCode=0 Feb 01 07:00:29 crc kubenswrapper[4546]: I0201 07:00:29.633641 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" event={"ID":"f024d767-7d73-468c-a37b-4bab42ab32ba","Type":"ContainerDied","Data":"9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed"} Feb 01 07:00:29 crc kubenswrapper[4546]: I0201 07:00:29.633690 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" 
event={"ID":"f024d767-7d73-468c-a37b-4bab42ab32ba","Type":"ContainerStarted","Data":"320c1f53b6215d287144db7c4daa03a5477edecbd8f55f4aff9395284acf97db"} Feb 01 07:00:29 crc kubenswrapper[4546]: I0201 07:00:29.650231 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8659947030000001 podStartE2EDuration="6.650218178s" podCreationTimestamp="2026-02-01 07:00:23 +0000 UTC" firstStartedPulling="2026-02-01 07:00:24.225132108 +0000 UTC m=+1054.876068114" lastFinishedPulling="2026-02-01 07:00:29.009355573 +0000 UTC m=+1059.660291589" observedRunningTime="2026-02-01 07:00:29.647028696 +0000 UTC m=+1060.297964702" watchObservedRunningTime="2026-02-01 07:00:29.650218178 +0000 UTC m=+1060.301154194" Feb 01 07:00:30 crc kubenswrapper[4546]: I0201 07:00:30.063233 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:30 crc kubenswrapper[4546]: I0201 07:00:30.569615 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:30 crc kubenswrapper[4546]: I0201 07:00:30.645415 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" event={"ID":"f024d767-7d73-468c-a37b-4bab42ab32ba","Type":"ContainerStarted","Data":"6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035"} Feb 01 07:00:30 crc kubenswrapper[4546]: I0201 07:00:30.645548 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-api" containerID="cri-o://5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648" gracePeriod=30 Feb 01 07:00:30 crc kubenswrapper[4546]: I0201 07:00:30.646359 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-log" 
containerID="cri-o://832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95" gracePeriod=30 Feb 01 07:00:31 crc kubenswrapper[4546]: I0201 07:00:31.021774 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:31 crc kubenswrapper[4546]: I0201 07:00:31.664647 4546 generic.go:334] "Generic (PLEG): container finished" podID="acd562a1-1119-4464-8887-bc606ef8cef4" containerID="832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95" exitCode=143 Feb 01 07:00:31 crc kubenswrapper[4546]: I0201 07:00:31.665224 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acd562a1-1119-4464-8887-bc606ef8cef4","Type":"ContainerDied","Data":"832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95"} Feb 01 07:00:31 crc kubenswrapper[4546]: I0201 07:00:31.666341 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:31 crc kubenswrapper[4546]: I0201 07:00:31.666620 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="proxy-httpd" containerID="cri-o://eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea" gracePeriod=30 Feb 01 07:00:31 crc kubenswrapper[4546]: I0201 07:00:31.666720 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="ceilometer-notification-agent" containerID="cri-o://0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e" gracePeriod=30 Feb 01 07:00:31 crc kubenswrapper[4546]: I0201 07:00:31.666725 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="sg-core" 
containerID="cri-o://f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5" gracePeriod=30 Feb 01 07:00:31 crc kubenswrapper[4546]: I0201 07:00:31.666598 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="ceilometer-central-agent" containerID="cri-o://ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61" gracePeriod=30 Feb 01 07:00:32 crc kubenswrapper[4546]: I0201 07:00:32.689174 4546 generic.go:334] "Generic (PLEG): container finished" podID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerID="eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea" exitCode=0 Feb 01 07:00:32 crc kubenswrapper[4546]: I0201 07:00:32.689217 4546 generic.go:334] "Generic (PLEG): container finished" podID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerID="f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5" exitCode=2 Feb 01 07:00:32 crc kubenswrapper[4546]: I0201 07:00:32.689226 4546 generic.go:334] "Generic (PLEG): container finished" podID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerID="0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e" exitCode=0 Feb 01 07:00:32 crc kubenswrapper[4546]: I0201 07:00:32.690082 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerDied","Data":"eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea"} Feb 01 07:00:32 crc kubenswrapper[4546]: I0201 07:00:32.690141 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerDied","Data":"f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5"} Feb 01 07:00:32 crc kubenswrapper[4546]: I0201 07:00:32.690156 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerDied","Data":"0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e"} Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.306379 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.323289 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.369021 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" podStartSLOduration=7.368992767 podStartE2EDuration="7.368992767s" podCreationTimestamp="2026-02-01 07:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:30.666273506 +0000 UTC m=+1061.317209523" watchObservedRunningTime="2026-02-01 07:00:34.368992767 +0000 UTC m=+1065.019928782" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.394953 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-combined-ca-bundle\") pod \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395020 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-ceilometer-tls-certs\") pod \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395067 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcmkq\" (UniqueName: 
\"kubernetes.io/projected/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-kube-api-access-gcmkq\") pod \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395106 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-log-httpd\") pod \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395165 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trftr\" (UniqueName: \"kubernetes.io/projected/acd562a1-1119-4464-8887-bc606ef8cef4-kube-api-access-trftr\") pod \"acd562a1-1119-4464-8887-bc606ef8cef4\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395204 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-scripts\") pod \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395252 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-combined-ca-bundle\") pod \"acd562a1-1119-4464-8887-bc606ef8cef4\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395276 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd562a1-1119-4464-8887-bc606ef8cef4-logs\") pod \"acd562a1-1119-4464-8887-bc606ef8cef4\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395350 4546 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-run-httpd\") pod \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395374 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-sg-core-conf-yaml\") pod \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395396 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-config-data\") pod \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\" (UID: \"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.395461 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-config-data\") pod \"acd562a1-1119-4464-8887-bc606ef8cef4\" (UID: \"acd562a1-1119-4464-8887-bc606ef8cef4\") " Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.399344 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" (UID: "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.399716 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acd562a1-1119-4464-8887-bc606ef8cef4-logs" (OuterVolumeSpecName: "logs") pod "acd562a1-1119-4464-8887-bc606ef8cef4" (UID: "acd562a1-1119-4464-8887-bc606ef8cef4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.399938 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" (UID: "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.445928 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-scripts" (OuterVolumeSpecName: "scripts") pod "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" (UID: "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.455052 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-kube-api-access-gcmkq" (OuterVolumeSpecName: "kube-api-access-gcmkq") pod "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" (UID: "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06"). InnerVolumeSpecName "kube-api-access-gcmkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.469288 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd562a1-1119-4464-8887-bc606ef8cef4-kube-api-access-trftr" (OuterVolumeSpecName: "kube-api-access-trftr") pod "acd562a1-1119-4464-8887-bc606ef8cef4" (UID: "acd562a1-1119-4464-8887-bc606ef8cef4"). InnerVolumeSpecName "kube-api-access-trftr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.509610 4546 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.509643 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trftr\" (UniqueName: \"kubernetes.io/projected/acd562a1-1119-4464-8887-bc606ef8cef4-kube-api-access-trftr\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.509655 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.509668 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd562a1-1119-4464-8887-bc606ef8cef4-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.509677 4546 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.509685 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcmkq\" (UniqueName: 
\"kubernetes.io/projected/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-kube-api-access-gcmkq\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.549060 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-config-data" (OuterVolumeSpecName: "config-data") pod "acd562a1-1119-4464-8887-bc606ef8cef4" (UID: "acd562a1-1119-4464-8887-bc606ef8cef4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.549089 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acd562a1-1119-4464-8887-bc606ef8cef4" (UID: "acd562a1-1119-4464-8887-bc606ef8cef4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.593961 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" (UID: "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.611901 4546 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.611932 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.611944 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd562a1-1119-4464-8887-bc606ef8cef4-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.617645 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" (UID: "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.619937 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" (UID: "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.677154 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-config-data" (OuterVolumeSpecName: "config-data") pod "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" (UID: "bbab8a7a-f3fa-4933-92ef-6d13c2d03b06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.718141 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.718507 4546 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.718576 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.720988 4546 generic.go:334] "Generic (PLEG): container finished" podID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerID="ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61" exitCode=0 Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.721087 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerDied","Data":"ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61"} Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.721136 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bbab8a7a-f3fa-4933-92ef-6d13c2d03b06","Type":"ContainerDied","Data":"4c4acda9d3da5f7f8a6c6d20173b5d38f02d476d30d48dc53efae3cb79cd1332"} Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.721156 4546 scope.go:117] "RemoveContainer" containerID="eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.721066 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.724076 4546 generic.go:334] "Generic (PLEG): container finished" podID="acd562a1-1119-4464-8887-bc606ef8cef4" containerID="5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648" exitCode=0 Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.724119 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acd562a1-1119-4464-8887-bc606ef8cef4","Type":"ContainerDied","Data":"5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648"} Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.724148 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"acd562a1-1119-4464-8887-bc606ef8cef4","Type":"ContainerDied","Data":"ec3e032a8148c477075edfcc958007ac0c328ca38481ba96eeb0d6f146bb300e"} Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.724324 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.744422 4546 scope.go:117] "RemoveContainer" containerID="f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.760986 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.766821 4546 scope.go:117] "RemoveContainer" containerID="0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.784720 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.792043 4546 scope.go:117] "RemoveContainer" containerID="ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.802404 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.817580 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.818090 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="ceilometer-central-agent" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818109 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="ceilometer-central-agent" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.818121 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-log" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818127 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-log" Feb 01 07:00:34 
crc kubenswrapper[4546]: E0201 07:00:34.818143 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="proxy-httpd" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818149 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="proxy-httpd" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.818155 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="ceilometer-notification-agent" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818161 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="ceilometer-notification-agent" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.818172 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="sg-core" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818177 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="sg-core" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.818197 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-api" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818203 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-api" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818394 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-log" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818403 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" containerName="nova-api-api" Feb 01 07:00:34 crc 
kubenswrapper[4546]: I0201 07:00:34.818411 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="proxy-httpd" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818420 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="ceilometer-central-agent" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818427 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="ceilometer-notification-agent" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.818439 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" containerName="sg-core" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.819954 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.823500 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.823682 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.824117 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.826320 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.834308 4546 scope.go:117] "RemoveContainer" containerID="eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.836305 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea\": container with ID starting with eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea not found: ID does not exist" containerID="eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.836347 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea"} err="failed to get container status \"eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea\": rpc error: code = NotFound desc = could not find container \"eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea\": container with ID starting with eae27e51e2be1fe6ac92f143b899a14a6b68a55d6cb28e11f480b2de500984ea not found: ID does not exist" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.836373 4546 scope.go:117] "RemoveContainer" containerID="f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.836880 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5\": container with ID starting with f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5 not found: ID does not exist" containerID="f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.836904 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5"} err="failed to get container status \"f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5\": rpc error: code = NotFound desc = could not find container \"f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5\": container with ID 
starting with f97a3c9f2ec4dabe0e9c121e72b3a06ef1f3ea029d8441e4f840e8969b869db5 not found: ID does not exist" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.836919 4546 scope.go:117] "RemoveContainer" containerID="0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.837165 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e\": container with ID starting with 0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e not found: ID does not exist" containerID="0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.837186 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e"} err="failed to get container status \"0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e\": rpc error: code = NotFound desc = could not find container \"0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e\": container with ID starting with 0b46770014bae562a644fc339ce4debb62b7c23576a3315e857cd13aa2870b4e not found: ID does not exist" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.837200 4546 scope.go:117] "RemoveContainer" containerID="ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.841013 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61\": container with ID starting with ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61 not found: ID does not exist" containerID="ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61" Feb 01 
07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.841040 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61"} err="failed to get container status \"ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61\": rpc error: code = NotFound desc = could not find container \"ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61\": container with ID starting with ca9b2dcf3672bac5d3f7b83d7af3570f2fc461316583d0b6ee28bb97834d7c61 not found: ID does not exist" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.841055 4546 scope.go:117] "RemoveContainer" containerID="5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.870089 4546 scope.go:117] "RemoveContainer" containerID="832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.874445 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.919055 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.920637 4546 scope.go:117] "RemoveContainer" containerID="5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.921024 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648\": container with ID starting with 5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648 not found: ID does not exist" containerID="5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.921083 4546 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648"} err="failed to get container status \"5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648\": rpc error: code = NotFound desc = could not find container \"5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648\": container with ID starting with 5fd1a36ff3a327007b653c55a48ee599ffc5075df37321b0ee883c295e5d9648 not found: ID does not exist" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.921107 4546 scope.go:117] "RemoveContainer" containerID="832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95" Feb 01 07:00:34 crc kubenswrapper[4546]: E0201 07:00:34.921332 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95\": container with ID starting with 832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95 not found: ID does not exist" containerID="832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.921352 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95"} err="failed to get container status \"832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95\": rpc error: code = NotFound desc = could not find container \"832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95\": container with ID starting with 832f97cf2dd6222d3ef8b54dc6ee713eb8cddfdb9ec4c758ada5b96d8855ff95 not found: ID does not exist" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.921741 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-log-httpd\") pod 
\"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.921802 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flkk8\" (UniqueName: \"kubernetes.io/projected/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-kube-api-access-flkk8\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.921841 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.921849 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.922322 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.922398 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-scripts\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.922483 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-config-data\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.922568 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.922586 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-run-httpd\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.924585 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.924724 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.925209 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 07:00:34 crc kubenswrapper[4546]: I0201 07:00:34.942983 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024451 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-log-httpd\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024499 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flkk8\" (UniqueName: \"kubernetes.io/projected/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-kube-api-access-flkk8\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024530 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024562 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024580 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024606 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-config-data\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024641 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-scripts\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024685 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt2fr\" (UniqueName: \"kubernetes.io/projected/6061bfb6-0e3d-431e-be4e-87cf1efe9868-kube-api-access-qt2fr\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024710 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-config-data\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024729 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024768 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-public-tls-certs\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024800 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " 
pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024816 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-run-httpd\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.024874 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6061bfb6-0e3d-431e-be4e-87cf1efe9868-logs\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.025401 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-log-httpd\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.025582 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-run-httpd\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.029163 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-config-data\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.029775 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-combined-ca-bundle\") 
pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.031081 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-scripts\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.031443 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.039418 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.040207 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flkk8\" (UniqueName: \"kubernetes.io/projected/f2c8ec52-c2fc-4e69-9bc3-192bb73267a9-kube-api-access-flkk8\") pod \"ceilometer-0\" (UID: \"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9\") " pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.126825 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.126932 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-config-data\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.127036 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt2fr\" (UniqueName: \"kubernetes.io/projected/6061bfb6-0e3d-431e-be4e-87cf1efe9868-kube-api-access-qt2fr\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.127108 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.127630 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-public-tls-certs\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.128942 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6061bfb6-0e3d-431e-be4e-87cf1efe9868-logs\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.129442 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6061bfb6-0e3d-431e-be4e-87cf1efe9868-logs\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 
07:00:35.131493 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-public-tls-certs\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.132289 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-config-data\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.132676 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.134495 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.142590 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.149292 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt2fr\" (UniqueName: \"kubernetes.io/projected/6061bfb6-0e3d-431e-be4e-87cf1efe9868-kube-api-access-qt2fr\") pod \"nova-api-0\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.240125 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:35 crc kubenswrapper[4546]: W0201 07:00:35.616675 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2c8ec52_c2fc_4e69_9bc3_192bb73267a9.slice/crio-89438a1174476be3c285914c6c77e72b2ca147d234628f0a09e4c7983270a6e8 WatchSource:0}: Error finding container 89438a1174476be3c285914c6c77e72b2ca147d234628f0a09e4c7983270a6e8: Status 404 returned error can't find the container with id 89438a1174476be3c285914c6c77e72b2ca147d234628f0a09e4c7983270a6e8 Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.623709 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.667450 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd562a1-1119-4464-8887-bc606ef8cef4" path="/var/lib/kubelet/pods/acd562a1-1119-4464-8887-bc606ef8cef4/volumes" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.668111 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbab8a7a-f3fa-4933-92ef-6d13c2d03b06" path="/var/lib/kubelet/pods/bbab8a7a-f3fa-4933-92ef-6d13c2d03b06/volumes" Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.738650 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9","Type":"ContainerStarted","Data":"89438a1174476be3c285914c6c77e72b2ca147d234628f0a09e4c7983270a6e8"} Feb 01 07:00:35 crc kubenswrapper[4546]: I0201 07:00:35.740953 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:36 crc kubenswrapper[4546]: I0201 07:00:36.020670 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:36 crc kubenswrapper[4546]: I0201 07:00:36.045567 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:36 crc kubenswrapper[4546]: I0201 07:00:36.764421 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6061bfb6-0e3d-431e-be4e-87cf1efe9868","Type":"ContainerStarted","Data":"b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35"} Feb 01 07:00:36 crc kubenswrapper[4546]: I0201 07:00:36.764481 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6061bfb6-0e3d-431e-be4e-87cf1efe9868","Type":"ContainerStarted","Data":"1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936"} Feb 01 07:00:36 crc kubenswrapper[4546]: I0201 07:00:36.764500 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6061bfb6-0e3d-431e-be4e-87cf1efe9868","Type":"ContainerStarted","Data":"94d70abdfaf8bb3e63138d6b5bd7c60468ea12770be80b15f98a82ee6bcea74a"} Feb 01 07:00:36 crc kubenswrapper[4546]: I0201 07:00:36.769083 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9","Type":"ContainerStarted","Data":"d5c769e408013f3f2f4d16f3fb1433f9374eec6193bb0aad561f3f4dd4861c9e"} Feb 01 07:00:36 crc kubenswrapper[4546]: I0201 07:00:36.788995 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7889781940000002 podStartE2EDuration="2.788978194s" podCreationTimestamp="2026-02-01 07:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:36.782605069 +0000 UTC m=+1067.433541095" watchObservedRunningTime="2026-02-01 07:00:36.788978194 +0000 UTC m=+1067.439914211" Feb 01 07:00:36 crc kubenswrapper[4546]: I0201 07:00:36.797554 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:00:37 
crc kubenswrapper[4546]: I0201 07:00:37.090840 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-qldlq"] Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.092971 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.097046 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.098077 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.138688 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qldlq"] Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.191334 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-config-data\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.191426 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbrqd\" (UniqueName: \"kubernetes.io/projected/11be2508-ee43-420e-83f9-bb37921807d8-kube-api-access-pbrqd\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.191477 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-scripts\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: 
\"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.191557 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.294759 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-config-data\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.294945 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbrqd\" (UniqueName: \"kubernetes.io/projected/11be2508-ee43-420e-83f9-bb37921807d8-kube-api-access-pbrqd\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.295024 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-scripts\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.295155 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: 
\"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.299832 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.303103 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-config-data\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.309326 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-scripts\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.320490 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbrqd\" (UniqueName: \"kubernetes.io/projected/11be2508-ee43-420e-83f9-bb37921807d8-kube-api-access-pbrqd\") pod \"nova-cell1-cell-mapping-qldlq\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.486163 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.783586 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9","Type":"ContainerStarted","Data":"aa5f126ed896c857b3018529500d60a9a8d981d78c0bb0c89417672f2f0da54b"} Feb 01 07:00:37 crc kubenswrapper[4546]: I0201 07:00:37.957475 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qldlq"] Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.191152 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.270416 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-664f5cdb7c-j8rfz"] Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.271304 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" podUID="43650776-3c2d-4c00-b082-55e3c3a9dce3" containerName="dnsmasq-dns" containerID="cri-o://381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8" gracePeriod=10 Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.730981 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.799288 4546 generic.go:334] "Generic (PLEG): container finished" podID="43650776-3c2d-4c00-b082-55e3c3a9dce3" containerID="381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8" exitCode=0 Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.799330 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" event={"ID":"43650776-3c2d-4c00-b082-55e3c3a9dce3","Type":"ContainerDied","Data":"381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8"} Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.799375 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.799392 4546 scope.go:117] "RemoveContainer" containerID="381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.799378 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-664f5cdb7c-j8rfz" event={"ID":"43650776-3c2d-4c00-b082-55e3c3a9dce3","Type":"ContainerDied","Data":"4311883920c5757173a30b5a8ab485929f68167d253ad9b34ec57c50a2c32633"} Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.802781 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qldlq" event={"ID":"11be2508-ee43-420e-83f9-bb37921807d8","Type":"ContainerStarted","Data":"a5558d53565683dee85253ea038f83ba31a8b2ca401b23441f4b82fe20f040db"} Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.802834 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qldlq" event={"ID":"11be2508-ee43-420e-83f9-bb37921807d8","Type":"ContainerStarted","Data":"c4cda951eb3b3f09db085e9c303cbdde24ff5d5d580e7ef114a528ca9258bc4a"} Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 
07:00:38.805226 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9","Type":"ContainerStarted","Data":"96537f263d2b018819cb1a31448938152d464c7569f487b1b6339f000be6fc25"} Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.829052 4546 scope.go:117] "RemoveContainer" containerID="172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.835607 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-qldlq" podStartSLOduration=1.83559162 podStartE2EDuration="1.83559162s" podCreationTimestamp="2026-02-01 07:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:38.820463683 +0000 UTC m=+1069.471399698" watchObservedRunningTime="2026-02-01 07:00:38.83559162 +0000 UTC m=+1069.486527636" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.842608 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9d5n\" (UniqueName: \"kubernetes.io/projected/43650776-3c2d-4c00-b082-55e3c3a9dce3-kube-api-access-t9d5n\") pod \"43650776-3c2d-4c00-b082-55e3c3a9dce3\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.842667 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-swift-storage-0\") pod \"43650776-3c2d-4c00-b082-55e3c3a9dce3\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.842706 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-config\") pod 
\"43650776-3c2d-4c00-b082-55e3c3a9dce3\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.842999 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-svc\") pod \"43650776-3c2d-4c00-b082-55e3c3a9dce3\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.843067 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-nb\") pod \"43650776-3c2d-4c00-b082-55e3c3a9dce3\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.843098 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-sb\") pod \"43650776-3c2d-4c00-b082-55e3c3a9dce3\" (UID: \"43650776-3c2d-4c00-b082-55e3c3a9dce3\") " Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.852974 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43650776-3c2d-4c00-b082-55e3c3a9dce3-kube-api-access-t9d5n" (OuterVolumeSpecName: "kube-api-access-t9d5n") pod "43650776-3c2d-4c00-b082-55e3c3a9dce3" (UID: "43650776-3c2d-4c00-b082-55e3c3a9dce3"). InnerVolumeSpecName "kube-api-access-t9d5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.864971 4546 scope.go:117] "RemoveContainer" containerID="381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8" Feb 01 07:00:38 crc kubenswrapper[4546]: E0201 07:00:38.865369 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8\": container with ID starting with 381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8 not found: ID does not exist" containerID="381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.865482 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8"} err="failed to get container status \"381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8\": rpc error: code = NotFound desc = could not find container \"381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8\": container with ID starting with 381ddac1b7b59d758e9bec74d18ada2a6f852a4a25ce23b10b2327bd27545fe8 not found: ID does not exist" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.865575 4546 scope.go:117] "RemoveContainer" containerID="172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482" Feb 01 07:00:38 crc kubenswrapper[4546]: E0201 07:00:38.865828 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482\": container with ID starting with 172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482 not found: ID does not exist" containerID="172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.865944 
4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482"} err="failed to get container status \"172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482\": rpc error: code = NotFound desc = could not find container \"172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482\": container with ID starting with 172ffa251b33bf8d258551ed16e7ab7e1ea3f747752e39a48fc64ee8fdbb4482 not found: ID does not exist" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.912906 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43650776-3c2d-4c00-b082-55e3c3a9dce3" (UID: "43650776-3c2d-4c00-b082-55e3c3a9dce3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.915463 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43650776-3c2d-4c00-b082-55e3c3a9dce3" (UID: "43650776-3c2d-4c00-b082-55e3c3a9dce3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.926877 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "43650776-3c2d-4c00-b082-55e3c3a9dce3" (UID: "43650776-3c2d-4c00-b082-55e3c3a9dce3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.927334 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-config" (OuterVolumeSpecName: "config") pod "43650776-3c2d-4c00-b082-55e3c3a9dce3" (UID: "43650776-3c2d-4c00-b082-55e3c3a9dce3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.946003 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.946024 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.946037 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9d5n\" (UniqueName: \"kubernetes.io/projected/43650776-3c2d-4c00-b082-55e3c3a9dce3-kube-api-access-t9d5n\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.946045 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.946053 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:38 crc kubenswrapper[4546]: I0201 07:00:38.963939 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43650776-3c2d-4c00-b082-55e3c3a9dce3" (UID: "43650776-3c2d-4c00-b082-55e3c3a9dce3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:00:39 crc kubenswrapper[4546]: I0201 07:00:39.047977 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43650776-3c2d-4c00-b082-55e3c3a9dce3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:39 crc kubenswrapper[4546]: I0201 07:00:39.141316 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-664f5cdb7c-j8rfz"] Feb 01 07:00:39 crc kubenswrapper[4546]: I0201 07:00:39.148159 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-664f5cdb7c-j8rfz"] Feb 01 07:00:39 crc kubenswrapper[4546]: I0201 07:00:39.667724 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43650776-3c2d-4c00-b082-55e3c3a9dce3" path="/var/lib/kubelet/pods/43650776-3c2d-4c00-b082-55e3c3a9dce3/volumes" Feb 01 07:00:40 crc kubenswrapper[4546]: I0201 07:00:40.837077 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2c8ec52-c2fc-4e69-9bc3-192bb73267a9","Type":"ContainerStarted","Data":"9ac7de3321c24ca1f8897705cd0972ea320731a67e6d5026fbaed576bf5a5f34"} Feb 01 07:00:40 crc kubenswrapper[4546]: I0201 07:00:40.838372 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 07:00:40 crc kubenswrapper[4546]: I0201 07:00:40.866222 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.361686511 podStartE2EDuration="6.86620731s" podCreationTimestamp="2026-02-01 07:00:34 +0000 UTC" firstStartedPulling="2026-02-01 07:00:35.620579343 +0000 UTC m=+1066.271515360" 
lastFinishedPulling="2026-02-01 07:00:40.125100143 +0000 UTC m=+1070.776036159" observedRunningTime="2026-02-01 07:00:40.855385269 +0000 UTC m=+1071.506321285" watchObservedRunningTime="2026-02-01 07:00:40.86620731 +0000 UTC m=+1071.517143326" Feb 01 07:00:42 crc kubenswrapper[4546]: I0201 07:00:42.865277 4546 generic.go:334] "Generic (PLEG): container finished" podID="11be2508-ee43-420e-83f9-bb37921807d8" containerID="a5558d53565683dee85253ea038f83ba31a8b2ca401b23441f4b82fe20f040db" exitCode=0 Feb 01 07:00:42 crc kubenswrapper[4546]: I0201 07:00:42.865375 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qldlq" event={"ID":"11be2508-ee43-420e-83f9-bb37921807d8","Type":"ContainerDied","Data":"a5558d53565683dee85253ea038f83ba31a8b2ca401b23441f4b82fe20f040db"} Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.170410 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.246550 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-scripts\") pod \"11be2508-ee43-420e-83f9-bb37921807d8\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.246726 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-combined-ca-bundle\") pod \"11be2508-ee43-420e-83f9-bb37921807d8\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.246753 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbrqd\" (UniqueName: \"kubernetes.io/projected/11be2508-ee43-420e-83f9-bb37921807d8-kube-api-access-pbrqd\") pod 
\"11be2508-ee43-420e-83f9-bb37921807d8\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.246817 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-config-data\") pod \"11be2508-ee43-420e-83f9-bb37921807d8\" (UID: \"11be2508-ee43-420e-83f9-bb37921807d8\") " Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.251696 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11be2508-ee43-420e-83f9-bb37921807d8-kube-api-access-pbrqd" (OuterVolumeSpecName: "kube-api-access-pbrqd") pod "11be2508-ee43-420e-83f9-bb37921807d8" (UID: "11be2508-ee43-420e-83f9-bb37921807d8"). InnerVolumeSpecName "kube-api-access-pbrqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.252649 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-scripts" (OuterVolumeSpecName: "scripts") pod "11be2508-ee43-420e-83f9-bb37921807d8" (UID: "11be2508-ee43-420e-83f9-bb37921807d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.270452 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-config-data" (OuterVolumeSpecName: "config-data") pod "11be2508-ee43-420e-83f9-bb37921807d8" (UID: "11be2508-ee43-420e-83f9-bb37921807d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.277018 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11be2508-ee43-420e-83f9-bb37921807d8" (UID: "11be2508-ee43-420e-83f9-bb37921807d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.350201 4546 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.350239 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.350254 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbrqd\" (UniqueName: \"kubernetes.io/projected/11be2508-ee43-420e-83f9-bb37921807d8-kube-api-access-pbrqd\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.350264 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11be2508-ee43-420e-83f9-bb37921807d8-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.885724 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qldlq" event={"ID":"11be2508-ee43-420e-83f9-bb37921807d8","Type":"ContainerDied","Data":"c4cda951eb3b3f09db085e9c303cbdde24ff5d5d580e7ef114a528ca9258bc4a"} Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.886021 4546 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="c4cda951eb3b3f09db085e9c303cbdde24ff5d5d580e7ef114a528ca9258bc4a" Feb 01 07:00:44 crc kubenswrapper[4546]: I0201 07:00:44.885810 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qldlq" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.082265 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.083149 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerName="nova-api-log" containerID="cri-o://1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936" gracePeriod=30 Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.083242 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerName="nova-api-api" containerID="cri-o://b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35" gracePeriod=30 Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.104281 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.104490 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0c54e3bc-bf1b-4a51-946d-be6858436839" containerName="nova-scheduler-scheduler" containerID="cri-o://6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971" gracePeriod=30 Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.115786 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.116053 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" 
containerName="nova-metadata-log" containerID="cri-o://b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c" gracePeriod=30 Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.116180 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-metadata" containerID="cri-o://a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b" gracePeriod=30 Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.643568 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.679603 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-config-data\") pod \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.680602 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-public-tls-certs\") pod \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.680745 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-combined-ca-bundle\") pod \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.680844 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-internal-tls-certs\") pod 
\"6061bfb6-0e3d-431e-be4e-87cf1efe9868\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.680962 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt2fr\" (UniqueName: \"kubernetes.io/projected/6061bfb6-0e3d-431e-be4e-87cf1efe9868-kube-api-access-qt2fr\") pod \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.681073 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6061bfb6-0e3d-431e-be4e-87cf1efe9868-logs\") pod \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\" (UID: \"6061bfb6-0e3d-431e-be4e-87cf1efe9868\") " Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.685538 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6061bfb6-0e3d-431e-be4e-87cf1efe9868-logs" (OuterVolumeSpecName: "logs") pod "6061bfb6-0e3d-431e-be4e-87cf1efe9868" (UID: "6061bfb6-0e3d-431e-be4e-87cf1efe9868"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.697212 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6061bfb6-0e3d-431e-be4e-87cf1efe9868-kube-api-access-qt2fr" (OuterVolumeSpecName: "kube-api-access-qt2fr") pod "6061bfb6-0e3d-431e-be4e-87cf1efe9868" (UID: "6061bfb6-0e3d-431e-be4e-87cf1efe9868"). InnerVolumeSpecName "kube-api-access-qt2fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.714762 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-config-data" (OuterVolumeSpecName: "config-data") pod "6061bfb6-0e3d-431e-be4e-87cf1efe9868" (UID: "6061bfb6-0e3d-431e-be4e-87cf1efe9868"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.736422 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6061bfb6-0e3d-431e-be4e-87cf1efe9868" (UID: "6061bfb6-0e3d-431e-be4e-87cf1efe9868"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.736841 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6061bfb6-0e3d-431e-be4e-87cf1efe9868" (UID: "6061bfb6-0e3d-431e-be4e-87cf1efe9868"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.757751 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6061bfb6-0e3d-431e-be4e-87cf1efe9868" (UID: "6061bfb6-0e3d-431e-be4e-87cf1efe9868"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.783933 4546 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.783964 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.783976 4546 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.783987 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt2fr\" (UniqueName: \"kubernetes.io/projected/6061bfb6-0e3d-431e-be4e-87cf1efe9868-kube-api-access-qt2fr\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.783998 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6061bfb6-0e3d-431e-be4e-87cf1efe9868-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.784011 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6061bfb6-0e3d-431e-be4e-87cf1efe9868-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.904921 4546 generic.go:334] "Generic (PLEG): container finished" podID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerID="b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c" exitCode=143 Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.905011 4546 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af","Type":"ContainerDied","Data":"b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c"} Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.908646 4546 generic.go:334] "Generic (PLEG): container finished" podID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerID="b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35" exitCode=0 Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.908681 4546 generic.go:334] "Generic (PLEG): container finished" podID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerID="1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936" exitCode=143 Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.908711 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6061bfb6-0e3d-431e-be4e-87cf1efe9868","Type":"ContainerDied","Data":"b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35"} Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.908733 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6061bfb6-0e3d-431e-be4e-87cf1efe9868","Type":"ContainerDied","Data":"1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936"} Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.908748 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6061bfb6-0e3d-431e-be4e-87cf1efe9868","Type":"ContainerDied","Data":"94d70abdfaf8bb3e63138d6b5bd7c60468ea12770be80b15f98a82ee6bcea74a"} Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.908766 4546 scope.go:117] "RemoveContainer" containerID="b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.908990 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.945900 4546 scope.go:117] "RemoveContainer" containerID="1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.958993 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.965666 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.974930 4546 scope.go:117] "RemoveContainer" containerID="b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35" Feb 01 07:00:45 crc kubenswrapper[4546]: E0201 07:00:45.975358 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35\": container with ID starting with b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35 not found: ID does not exist" containerID="b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.975398 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35"} err="failed to get container status \"b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35\": rpc error: code = NotFound desc = could not find container \"b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35\": container with ID starting with b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35 not found: ID does not exist" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.975421 4546 scope.go:117] "RemoveContainer" containerID="1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936" Feb 01 07:00:45 crc kubenswrapper[4546]: E0201 
07:00:45.975643 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936\": container with ID starting with 1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936 not found: ID does not exist" containerID="1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.975669 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936"} err="failed to get container status \"1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936\": rpc error: code = NotFound desc = could not find container \"1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936\": container with ID starting with 1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936 not found: ID does not exist" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.975704 4546 scope.go:117] "RemoveContainer" containerID="b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.976160 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35"} err="failed to get container status \"b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35\": rpc error: code = NotFound desc = could not find container \"b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35\": container with ID starting with b6fb03608578a634f593c6840cc2b9a7d55db8b592a9c1cd7c7f57b5806d8e35 not found: ID does not exist" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.976180 4546 scope.go:117] "RemoveContainer" containerID="1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936" Feb 01 07:00:45 crc 
kubenswrapper[4546]: I0201 07:00:45.976371 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.976388 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936"} err="failed to get container status \"1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936\": rpc error: code = NotFound desc = could not find container \"1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936\": container with ID starting with 1947e39e95331486ce7ff6d373ef034e54da088b8c8e24eb602118120c3f2936 not found: ID does not exist" Feb 01 07:00:45 crc kubenswrapper[4546]: E0201 07:00:45.978593 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerName="nova-api-api" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.978616 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerName="nova-api-api" Feb 01 07:00:45 crc kubenswrapper[4546]: E0201 07:00:45.978627 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43650776-3c2d-4c00-b082-55e3c3a9dce3" containerName="init" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.978635 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="43650776-3c2d-4c00-b082-55e3c3a9dce3" containerName="init" Feb 01 07:00:45 crc kubenswrapper[4546]: E0201 07:00:45.978648 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11be2508-ee43-420e-83f9-bb37921807d8" containerName="nova-manage" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.978662 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="11be2508-ee43-420e-83f9-bb37921807d8" containerName="nova-manage" Feb 01 07:00:45 crc kubenswrapper[4546]: E0201 07:00:45.978681 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="43650776-3c2d-4c00-b082-55e3c3a9dce3" containerName="dnsmasq-dns" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.978687 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="43650776-3c2d-4c00-b082-55e3c3a9dce3" containerName="dnsmasq-dns" Feb 01 07:00:45 crc kubenswrapper[4546]: E0201 07:00:45.978697 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerName="nova-api-log" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.978702 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerName="nova-api-log" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.978913 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerName="nova-api-log" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.978932 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="11be2508-ee43-420e-83f9-bb37921807d8" containerName="nova-manage" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.978946 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" containerName="nova-api-api" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.978959 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="43650776-3c2d-4c00-b082-55e3c3a9dce3" containerName="dnsmasq-dns" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.980387 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.984198 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.984366 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.984498 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 01 07:00:45 crc kubenswrapper[4546]: I0201 07:00:45.995286 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.090827 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-config-data\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.091199 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cffcac-b561-4376-9a36-5dddb2ad36fa-logs\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.091248 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhm6b\" (UniqueName: \"kubernetes.io/projected/c2cffcac-b561-4376-9a36-5dddb2ad36fa-kube-api-access-lhm6b\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.091289 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-public-tls-certs\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.091335 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.091364 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.193215 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-config-data\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.193267 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cffcac-b561-4376-9a36-5dddb2ad36fa-logs\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.193303 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhm6b\" (UniqueName: \"kubernetes.io/projected/c2cffcac-b561-4376-9a36-5dddb2ad36fa-kube-api-access-lhm6b\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc 
kubenswrapper[4546]: I0201 07:00:46.193342 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-public-tls-certs\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.193389 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.193418 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.195035 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2cffcac-b561-4376-9a36-5dddb2ad36fa-logs\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.200434 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.200612 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-config-data\") pod \"nova-api-0\" (UID: 
\"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.201213 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.201621 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2cffcac-b561-4376-9a36-5dddb2ad36fa-public-tls-certs\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.210555 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhm6b\" (UniqueName: \"kubernetes.io/projected/c2cffcac-b561-4376-9a36-5dddb2ad36fa-kube-api-access-lhm6b\") pod \"nova-api-0\" (UID: \"c2cffcac-b561-4376-9a36-5dddb2ad36fa\") " pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.299076 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:00:46 crc kubenswrapper[4546]: E0201 07:00:46.672206 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:00:46 crc kubenswrapper[4546]: E0201 07:00:46.673929 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:00:46 crc kubenswrapper[4546]: E0201 07:00:46.675298 4546 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:00:46 crc kubenswrapper[4546]: E0201 07:00:46.675363 4546 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0c54e3bc-bf1b-4a51-946d-be6858436839" containerName="nova-scheduler-scheduler" Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.733105 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.933900 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c2cffcac-b561-4376-9a36-5dddb2ad36fa","Type":"ContainerStarted","Data":"85c5c3c4c605d886f76698066073b0e673ecdd37f4321cdeb5103f1dbdc0c73c"} Feb 01 07:00:46 crc kubenswrapper[4546]: I0201 07:00:46.934300 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2cffcac-b561-4376-9a36-5dddb2ad36fa","Type":"ContainerStarted","Data":"f138b05b1716a97e1506e731c142cfdc0830d42c0dc2dea013511391408664f1"} Feb 01 07:00:47 crc kubenswrapper[4546]: I0201 07:00:47.668649 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6061bfb6-0e3d-431e-be4e-87cf1efe9868" path="/var/lib/kubelet/pods/6061bfb6-0e3d-431e-be4e-87cf1efe9868/volumes" Feb 01 07:00:47 crc kubenswrapper[4546]: I0201 07:00:47.946776 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2cffcac-b561-4376-9a36-5dddb2ad36fa","Type":"ContainerStarted","Data":"eb297f12fdad93269f684e092c88a54d508d8eb2a67c57ca4ea6da724b3dbeef"} Feb 01 07:00:47 crc kubenswrapper[4546]: I0201 07:00:47.972786 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.972762696 podStartE2EDuration="2.972762696s" podCreationTimestamp="2026-02-01 07:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:47.964344867 +0000 UTC m=+1078.615280882" watchObservedRunningTime="2026-02-01 07:00:47.972762696 +0000 UTC m=+1078.623698701" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.252586 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:38214->10.217.0.210:8775: read: connection reset by peer" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.252616 4546 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:38208->10.217.0.210:8775: read: connection reset by peer" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.676850 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.770284 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtr7w\" (UniqueName: \"kubernetes.io/projected/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-kube-api-access-xtr7w\") pod \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.770425 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-combined-ca-bundle\") pod \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.770521 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-config-data\") pod \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.770707 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-nova-metadata-tls-certs\") pod \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.770777 4546 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-logs\") pod \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\" (UID: \"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af\") " Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.772027 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-logs" (OuterVolumeSpecName: "logs") pod "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" (UID: "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.781002 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-kube-api-access-xtr7w" (OuterVolumeSpecName: "kube-api-access-xtr7w") pod "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" (UID: "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af"). InnerVolumeSpecName "kube-api-access-xtr7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.805035 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" (UID: "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.822749 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-config-data" (OuterVolumeSpecName: "config-data") pod "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" (UID: "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.870894 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" (UID: "9e47d60a-0b20-4ebb-8ac8-bfbd33e312af"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.876416 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtr7w\" (UniqueName: \"kubernetes.io/projected/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-kube-api-access-xtr7w\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.876449 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.876459 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.876470 4546 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.876479 4546 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.957594 4546 generic.go:334] "Generic (PLEG): container finished" 
podID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerID="a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b" exitCode=0 Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.957649 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.957707 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af","Type":"ContainerDied","Data":"a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b"} Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.957736 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e47d60a-0b20-4ebb-8ac8-bfbd33e312af","Type":"ContainerDied","Data":"cb1791c989372e568365a6d1b209be72e778e32b9f9ee5c804a72a612d236119"} Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.957755 4546 scope.go:117] "RemoveContainer" containerID="a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.985382 4546 scope.go:117] "RemoveContainer" containerID="b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c" Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.989104 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:48 crc kubenswrapper[4546]: I0201 07:00:48.997267 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.004763 4546 scope.go:117] "RemoveContainer" containerID="a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b" Feb 01 07:00:49 crc kubenswrapper[4546]: E0201 07:00:49.005336 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b\": container with ID starting with a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b not found: ID does not exist" containerID="a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.005385 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b"} err="failed to get container status \"a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b\": rpc error: code = NotFound desc = could not find container \"a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b\": container with ID starting with a28f55f4d3d3f23cc849909ae323cb5663e697f1de878b535e1a8486a83e143b not found: ID does not exist" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.005419 4546 scope.go:117] "RemoveContainer" containerID="b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c" Feb 01 07:00:49 crc kubenswrapper[4546]: E0201 07:00:49.005966 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c\": container with ID starting with b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c not found: ID does not exist" containerID="b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.006024 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c"} err="failed to get container status \"b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c\": rpc error: code = NotFound desc = could not find container \"b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c\": container with ID 
starting with b1aebe9e48f6b168e092de0e09f94cbb2dccda1791ee9d10f8f89e1bc0da345c not found: ID does not exist" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.017203 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:49 crc kubenswrapper[4546]: E0201 07:00:49.017769 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-log" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.017791 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-log" Feb 01 07:00:49 crc kubenswrapper[4546]: E0201 07:00:49.017835 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-metadata" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.017842 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-metadata" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.018128 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-log" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.018144 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" containerName="nova-metadata-metadata" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.019617 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.031288 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.031602 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.044027 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.083959 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-logs\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.084140 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.084543 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.084960 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-config-data\") pod \"nova-metadata-0\" (UID: 
\"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.085161 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6nxc\" (UniqueName: \"kubernetes.io/projected/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-kube-api-access-v6nxc\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.187335 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6nxc\" (UniqueName: \"kubernetes.io/projected/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-kube-api-access-v6nxc\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.187787 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-logs\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.188138 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-logs\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.188273 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.188886 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.189065 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-config-data\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.192734 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.193113 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-config-data\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.194494 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.213243 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6nxc\" (UniqueName: \"kubernetes.io/projected/1d2d7d7b-324f-4442-b6be-fccd76a2b3d2-kube-api-access-v6nxc\") pod 
\"nova-metadata-0\" (UID: \"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2\") " pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.350835 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.666287 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e47d60a-0b20-4ebb-8ac8-bfbd33e312af" path="/var/lib/kubelet/pods/9e47d60a-0b20-4ebb-8ac8-bfbd33e312af/volumes" Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.837235 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:00:49 crc kubenswrapper[4546]: I0201 07:00:49.968932 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2","Type":"ContainerStarted","Data":"c3e46caa0caa333ed8599e4a3be0b7c9347062f7a10dd7805be794cdceef13f4"} Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.316763 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.414520 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-combined-ca-bundle\") pod \"0c54e3bc-bf1b-4a51-946d-be6858436839\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.414763 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-config-data\") pod \"0c54e3bc-bf1b-4a51-946d-be6858436839\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.414829 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbxfs\" (UniqueName: \"kubernetes.io/projected/0c54e3bc-bf1b-4a51-946d-be6858436839-kube-api-access-xbxfs\") pod \"0c54e3bc-bf1b-4a51-946d-be6858436839\" (UID: \"0c54e3bc-bf1b-4a51-946d-be6858436839\") " Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.429261 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c54e3bc-bf1b-4a51-946d-be6858436839-kube-api-access-xbxfs" (OuterVolumeSpecName: "kube-api-access-xbxfs") pod "0c54e3bc-bf1b-4a51-946d-be6858436839" (UID: "0c54e3bc-bf1b-4a51-946d-be6858436839"). InnerVolumeSpecName "kube-api-access-xbxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.450882 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c54e3bc-bf1b-4a51-946d-be6858436839" (UID: "0c54e3bc-bf1b-4a51-946d-be6858436839"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.454977 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-config-data" (OuterVolumeSpecName: "config-data") pod "0c54e3bc-bf1b-4a51-946d-be6858436839" (UID: "0c54e3bc-bf1b-4a51-946d-be6858436839"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.517229 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.517258 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbxfs\" (UniqueName: \"kubernetes.io/projected/0c54e3bc-bf1b-4a51-946d-be6858436839-kube-api-access-xbxfs\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.517271 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c54e3bc-bf1b-4a51-946d-be6858436839-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.985244 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2","Type":"ContainerStarted","Data":"f89590e2b0ddcd15b8afabe8b83a7cbd3bb341573c82d8aaddf1947bef8ee8e8"} Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.985331 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d2d7d7b-324f-4442-b6be-fccd76a2b3d2","Type":"ContainerStarted","Data":"36b5f712786c54c446acaa49bfb3d377e20ce6e4c6ca5a676d0db95eb7461c29"} Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.990691 4546 generic.go:334] "Generic (PLEG): 
container finished" podID="0c54e3bc-bf1b-4a51-946d-be6858436839" containerID="6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971" exitCode=0 Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.990774 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c54e3bc-bf1b-4a51-946d-be6858436839","Type":"ContainerDied","Data":"6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971"} Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.990837 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c54e3bc-bf1b-4a51-946d-be6858436839","Type":"ContainerDied","Data":"cc5aeedece70386c63982a832186a55b8d142769bfd7bcfa20fa39088e89618c"} Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.990888 4546 scope.go:117] "RemoveContainer" containerID="6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971" Feb 01 07:00:50 crc kubenswrapper[4546]: I0201 07:00:50.991070 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.017636 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.017615896 podStartE2EDuration="3.017615896s" podCreationTimestamp="2026-02-01 07:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:51.004719314 +0000 UTC m=+1081.655655330" watchObservedRunningTime="2026-02-01 07:00:51.017615896 +0000 UTC m=+1081.668551912" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.018275 4546 scope.go:117] "RemoveContainer" containerID="6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971" Feb 01 07:00:51 crc kubenswrapper[4546]: E0201 07:00:51.018677 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971\": container with ID starting with 6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971 not found: ID does not exist" containerID="6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.018719 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971"} err="failed to get container status \"6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971\": rpc error: code = NotFound desc = could not find container \"6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971\": container with ID starting with 6c17a5070cc33db92c2868584282a9506db396d4a801c52cd4477ecb6532a971 not found: ID does not exist" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.039509 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-scheduler-0"] Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.053226 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.060997 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:51 crc kubenswrapper[4546]: E0201 07:00:51.061431 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c54e3bc-bf1b-4a51-946d-be6858436839" containerName="nova-scheduler-scheduler" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.061467 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c54e3bc-bf1b-4a51-946d-be6858436839" containerName="nova-scheduler-scheduler" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.061676 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c54e3bc-bf1b-4a51-946d-be6858436839" containerName="nova-scheduler-scheduler" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.062387 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.068279 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.083531 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.133448 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e07819-983e-47cb-a647-cf8d2a1ff5fd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"99e07819-983e-47cb-a647-cf8d2a1ff5fd\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.133497 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f58mk\" (UniqueName: \"kubernetes.io/projected/99e07819-983e-47cb-a647-cf8d2a1ff5fd-kube-api-access-f58mk\") pod \"nova-scheduler-0\" (UID: \"99e07819-983e-47cb-a647-cf8d2a1ff5fd\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.133803 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99e07819-983e-47cb-a647-cf8d2a1ff5fd-config-data\") pod \"nova-scheduler-0\" (UID: \"99e07819-983e-47cb-a647-cf8d2a1ff5fd\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.236501 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99e07819-983e-47cb-a647-cf8d2a1ff5fd-config-data\") pod \"nova-scheduler-0\" (UID: \"99e07819-983e-47cb-a647-cf8d2a1ff5fd\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.236818 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e07819-983e-47cb-a647-cf8d2a1ff5fd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"99e07819-983e-47cb-a647-cf8d2a1ff5fd\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.236896 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f58mk\" (UniqueName: \"kubernetes.io/projected/99e07819-983e-47cb-a647-cf8d2a1ff5fd-kube-api-access-f58mk\") pod \"nova-scheduler-0\" (UID: \"99e07819-983e-47cb-a647-cf8d2a1ff5fd\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.245670 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e07819-983e-47cb-a647-cf8d2a1ff5fd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"99e07819-983e-47cb-a647-cf8d2a1ff5fd\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.247405 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99e07819-983e-47cb-a647-cf8d2a1ff5fd-config-data\") pod \"nova-scheduler-0\" (UID: \"99e07819-983e-47cb-a647-cf8d2a1ff5fd\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.251550 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f58mk\" (UniqueName: \"kubernetes.io/projected/99e07819-983e-47cb-a647-cf8d2a1ff5fd-kube-api-access-f58mk\") pod \"nova-scheduler-0\" (UID: \"99e07819-983e-47cb-a647-cf8d2a1ff5fd\") " pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.389091 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.665725 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c54e3bc-bf1b-4a51-946d-be6858436839" path="/var/lib/kubelet/pods/0c54e3bc-bf1b-4a51-946d-be6858436839/volumes" Feb 01 07:00:51 crc kubenswrapper[4546]: I0201 07:00:51.817360 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:00:52 crc kubenswrapper[4546]: I0201 07:00:52.018580 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"99e07819-983e-47cb-a647-cf8d2a1ff5fd","Type":"ContainerStarted","Data":"4aed185aaa4420c2742a24153c3744181076a2f788e7e7ba5eeca279dc39dca1"} Feb 01 07:00:52 crc kubenswrapper[4546]: I0201 07:00:52.018975 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"99e07819-983e-47cb-a647-cf8d2a1ff5fd","Type":"ContainerStarted","Data":"13ae55040b43e13a14b0dce0e043e0f17a4bc8ba0cd937c676e85a5dbb6fdbd9"} Feb 01 07:00:52 crc kubenswrapper[4546]: I0201 07:00:52.038545 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.038530785 podStartE2EDuration="1.038530785s" podCreationTimestamp="2026-02-01 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:00:52.033634936 +0000 UTC m=+1082.684570952" watchObservedRunningTime="2026-02-01 07:00:52.038530785 +0000 UTC m=+1082.689466801" Feb 01 07:00:54 crc kubenswrapper[4546]: I0201 07:00:54.351014 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 07:00:54 crc kubenswrapper[4546]: I0201 07:00:54.351526 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 07:00:55 crc kubenswrapper[4546]: I0201 
07:00:55.420484 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:00:55 crc kubenswrapper[4546]: I0201 07:00:55.420840 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:00:56 crc kubenswrapper[4546]: I0201 07:00:56.301960 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 07:00:56 crc kubenswrapper[4546]: I0201 07:00:56.302041 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 07:00:56 crc kubenswrapper[4546]: I0201 07:00:56.389921 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 01 07:00:57 crc kubenswrapper[4546]: I0201 07:00:57.323029 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c2cffcac-b561-4376-9a36-5dddb2ad36fa" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 07:00:57 crc kubenswrapper[4546]: I0201 07:00:57.323062 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c2cffcac-b561-4376-9a36-5dddb2ad36fa" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 07:00:59 crc kubenswrapper[4546]: I0201 07:00:59.351081 4546 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 07:00:59 crc kubenswrapper[4546]: I0201 07:00:59.351524 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.142284 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29498821-c5ttk"] Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.143992 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.162409 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29498821-c5ttk"] Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.249534 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-fernet-keys\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.249590 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-combined-ca-bundle\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.249799 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-config-data\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 
01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.250026 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkkcb\" (UniqueName: \"kubernetes.io/projected/b918cde3-d2a1-466b-9ae2-32d239389735-kube-api-access-pkkcb\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.360218 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkkcb\" (UniqueName: \"kubernetes.io/projected/b918cde3-d2a1-466b-9ae2-32d239389735-kube-api-access-pkkcb\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.360385 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-fernet-keys\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.360531 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-combined-ca-bundle\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.360738 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-config-data\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc 
kubenswrapper[4546]: I0201 07:01:00.362556 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1d2d7d7b-324f-4442-b6be-fccd76a2b3d2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.362842 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1d2d7d7b-324f-4442-b6be-fccd76a2b3d2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.377129 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-config-data\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.380499 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-combined-ca-bundle\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.381434 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkkcb\" (UniqueName: \"kubernetes.io/projected/b918cde3-d2a1-466b-9ae2-32d239389735-kube-api-access-pkkcb\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.384390 4546 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-fernet-keys\") pod \"keystone-cron-29498821-c5ttk\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.481610 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:00 crc kubenswrapper[4546]: I0201 07:01:00.918653 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29498821-c5ttk"] Feb 01 07:01:01 crc kubenswrapper[4546]: I0201 07:01:01.120332 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498821-c5ttk" event={"ID":"b918cde3-d2a1-466b-9ae2-32d239389735","Type":"ContainerStarted","Data":"1cc0afa42a2be50ad3909d78a187431406014c851566be1c5cb955d4ff817a00"} Feb 01 07:01:01 crc kubenswrapper[4546]: I0201 07:01:01.120386 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498821-c5ttk" event={"ID":"b918cde3-d2a1-466b-9ae2-32d239389735","Type":"ContainerStarted","Data":"bc6603ad19aed5465fcc04edac24cbff5490ef4d994459c25d98661b1335184a"} Feb 01 07:01:01 crc kubenswrapper[4546]: I0201 07:01:01.135662 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29498821-c5ttk" podStartSLOduration=1.135635983 podStartE2EDuration="1.135635983s" podCreationTimestamp="2026-02-01 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:01:01.134944689 +0000 UTC m=+1091.785880705" watchObservedRunningTime="2026-02-01 07:01:01.135635983 +0000 UTC m=+1091.786571999" Feb 01 07:01:01 crc kubenswrapper[4546]: I0201 07:01:01.389837 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 01 07:01:01 crc 
kubenswrapper[4546]: I0201 07:01:01.422745 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 01 07:01:02 crc kubenswrapper[4546]: I0201 07:01:02.156637 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 01 07:01:04 crc kubenswrapper[4546]: I0201 07:01:04.146100 4546 generic.go:334] "Generic (PLEG): container finished" podID="b918cde3-d2a1-466b-9ae2-32d239389735" containerID="1cc0afa42a2be50ad3909d78a187431406014c851566be1c5cb955d4ff817a00" exitCode=0 Feb 01 07:01:04 crc kubenswrapper[4546]: I0201 07:01:04.146210 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498821-c5ttk" event={"ID":"b918cde3-d2a1-466b-9ae2-32d239389735","Type":"ContainerDied","Data":"1cc0afa42a2be50ad3909d78a187431406014c851566be1c5cb955d4ff817a00"} Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.154765 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.482536 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.577073 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-config-data\") pod \"b918cde3-d2a1-466b-9ae2-32d239389735\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.577486 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-fernet-keys\") pod \"b918cde3-d2a1-466b-9ae2-32d239389735\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.577517 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-combined-ca-bundle\") pod \"b918cde3-d2a1-466b-9ae2-32d239389735\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.577641 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkkcb\" (UniqueName: \"kubernetes.io/projected/b918cde3-d2a1-466b-9ae2-32d239389735-kube-api-access-pkkcb\") pod \"b918cde3-d2a1-466b-9ae2-32d239389735\" (UID: \"b918cde3-d2a1-466b-9ae2-32d239389735\") " Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.584763 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b918cde3-d2a1-466b-9ae2-32d239389735" (UID: "b918cde3-d2a1-466b-9ae2-32d239389735"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.584846 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b918cde3-d2a1-466b-9ae2-32d239389735-kube-api-access-pkkcb" (OuterVolumeSpecName: "kube-api-access-pkkcb") pod "b918cde3-d2a1-466b-9ae2-32d239389735" (UID: "b918cde3-d2a1-466b-9ae2-32d239389735"). InnerVolumeSpecName "kube-api-access-pkkcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.604198 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b918cde3-d2a1-466b-9ae2-32d239389735" (UID: "b918cde3-d2a1-466b-9ae2-32d239389735"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.626476 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-config-data" (OuterVolumeSpecName: "config-data") pod "b918cde3-d2a1-466b-9ae2-32d239389735" (UID: "b918cde3-d2a1-466b-9ae2-32d239389735"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.680136 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkkcb\" (UniqueName: \"kubernetes.io/projected/b918cde3-d2a1-466b-9ae2-32d239389735-kube-api-access-pkkcb\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.680165 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.680177 4546 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:05 crc kubenswrapper[4546]: I0201 07:01:05.680187 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b918cde3-d2a1-466b-9ae2-32d239389735-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:06 crc kubenswrapper[4546]: I0201 07:01:06.167153 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498821-c5ttk" event={"ID":"b918cde3-d2a1-466b-9ae2-32d239389735","Type":"ContainerDied","Data":"bc6603ad19aed5465fcc04edac24cbff5490ef4d994459c25d98661b1335184a"} Feb 01 07:01:06 crc kubenswrapper[4546]: I0201 07:01:06.167222 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc6603ad19aed5465fcc04edac24cbff5490ef4d994459c25d98661b1335184a" Feb 01 07:01:06 crc kubenswrapper[4546]: I0201 07:01:06.168344 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29498821-c5ttk" Feb 01 07:01:06 crc kubenswrapper[4546]: I0201 07:01:06.313867 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 01 07:01:06 crc kubenswrapper[4546]: I0201 07:01:06.315022 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 01 07:01:06 crc kubenswrapper[4546]: I0201 07:01:06.315292 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 01 07:01:06 crc kubenswrapper[4546]: I0201 07:01:06.327750 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 01 07:01:07 crc kubenswrapper[4546]: I0201 07:01:07.178160 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 01 07:01:07 crc kubenswrapper[4546]: I0201 07:01:07.186703 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 01 07:01:09 crc kubenswrapper[4546]: I0201 07:01:09.356955 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 01 07:01:09 crc kubenswrapper[4546]: I0201 07:01:09.358091 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 01 07:01:09 crc kubenswrapper[4546]: I0201 07:01:09.362993 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 01 07:01:10 crc kubenswrapper[4546]: I0201 07:01:10.211986 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 01 07:01:16 crc kubenswrapper[4546]: I0201 07:01:16.866621 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:01:17 crc kubenswrapper[4546]: I0201 07:01:17.699257 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:01:21 crc kubenswrapper[4546]: I0201 07:01:21.799307 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f9259854-6c00-413e-9061-399c808d9360" containerName="rabbitmq" containerID="cri-o://7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5" gracePeriod=604796 Feb 01 07:01:22 crc kubenswrapper[4546]: I0201 07:01:22.135308 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" containerName="rabbitmq" containerID="cri-o://618ce162921f9b2a8ee7ddb9d0c2ca5cb307fca3dff89e03352d2a264ff4e972" gracePeriod=604796 Feb 01 07:01:25 crc kubenswrapper[4546]: I0201 07:01:25.420573 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:01:25 crc kubenswrapper[4546]: I0201 07:01:25.420887 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.398575 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7458997b65-dpmh6"] Feb 01 07:01:27 crc kubenswrapper[4546]: E0201 07:01:27.399661 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b918cde3-d2a1-466b-9ae2-32d239389735" containerName="keystone-cron" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.399680 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b918cde3-d2a1-466b-9ae2-32d239389735" containerName="keystone-cron" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.399948 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="b918cde3-d2a1-466b-9ae2-32d239389735" containerName="keystone-cron" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.401071 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.405011 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.415842 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7458997b65-dpmh6"] Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.489271 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-config\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.489384 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-swift-storage-0\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.489407 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-openstack-edpm-ipam\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " 
pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.489455 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-nb\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.489492 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-sb\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.489514 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsw6c\" (UniqueName: \"kubernetes.io/projected/840e0532-bc87-41b3-8424-3d4128bd9dcf-kube-api-access-dsw6c\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.489715 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-svc\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.592926 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-config\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: 
\"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.593092 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-swift-storage-0\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.593123 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-openstack-edpm-ipam\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.593192 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-nb\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.593248 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-sb\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.593282 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsw6c\" (UniqueName: \"kubernetes.io/projected/840e0532-bc87-41b3-8424-3d4128bd9dcf-kube-api-access-dsw6c\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: 
\"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.593350 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-svc\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.594062 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-config\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.594115 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-nb\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.594183 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-sb\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.594303 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-svc\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: 
I0201 07:01:27.594501 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-swift-storage-0\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.594761 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-openstack-edpm-ipam\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.612486 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsw6c\" (UniqueName: \"kubernetes.io/projected/840e0532-bc87-41b3-8424-3d4128bd9dcf-kube-api-access-dsw6c\") pod \"dnsmasq-dns-7458997b65-dpmh6\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.727922 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.810511 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="f9259854-6c00-413e-9061-399c808d9360" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.94:5671: connect: connection refused" Feb 01 07:01:27 crc kubenswrapper[4546]: I0201 07:01:27.901571 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.204219 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7458997b65-dpmh6"] Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.294387 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.414828 4546 generic.go:334] "Generic (PLEG): container finished" podID="f9259854-6c00-413e-9061-399c808d9360" containerID="7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5" exitCode=0 Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.414973 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f9259854-6c00-413e-9061-399c808d9360","Type":"ContainerDied","Data":"7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5"} Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.415014 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f9259854-6c00-413e-9061-399c808d9360","Type":"ContainerDied","Data":"7b05fa1d0884ba595ff157260bc518ba05036ede66618511bdd4c27aaae77078"} Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.415035 4546 scope.go:117] "RemoveContainer" 
containerID="7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.415209 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.416767 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-config-data\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.416888 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.417006 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-confd\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.417096 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9259854-6c00-413e-9061-399c808d9360-pod-info\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.418149 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkfdp\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-kube-api-access-pkfdp\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 
01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.418257 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-plugins\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.419814 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.421066 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-erlang-cookie\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.423305 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.423408 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9259854-6c00-413e-9061-399c808d9360-erlang-cookie-secret\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.424280 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973","Type":"ContainerDied","Data":"618ce162921f9b2a8ee7ddb9d0c2ca5cb307fca3dff89e03352d2a264ff4e972"} Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.424248 4546 generic.go:334] "Generic (PLEG): container finished" podID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" containerID="618ce162921f9b2a8ee7ddb9d0c2ca5cb307fca3dff89e03352d2a264ff4e972" exitCode=0 Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.424564 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-plugins-conf\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.424682 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-tls\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: \"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.425064 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-server-conf\") pod \"f9259854-6c00-413e-9061-399c808d9360\" (UID: 
\"f9259854-6c00-413e-9061-399c808d9360\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.425107 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.425845 4546 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.426422 4546 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.426817 4546 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.428568 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9259854-6c00-413e-9061-399c808d9360-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.429470 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.429671 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f9259854-6c00-413e-9061-399c808d9360-pod-info" (OuterVolumeSpecName: "pod-info") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.429982 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.430065 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" event={"ID":"840e0532-bc87-41b3-8424-3d4128bd9dcf","Type":"ContainerStarted","Data":"e4e70fba14ca9f0189a18630708ac4a656d2172a1be373125864dc09cb187d5b"} Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.431133 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-kube-api-access-pkfdp" (OuterVolumeSpecName: "kube-api-access-pkfdp") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "kube-api-access-pkfdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.445009 4546 scope.go:117] "RemoveContainer" containerID="de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.504097 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-config-data" (OuterVolumeSpecName: "config-data") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.529207 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-server-conf" (OuterVolumeSpecName: "server-conf") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.529961 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.530006 4546 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.530022 4546 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f9259854-6c00-413e-9061-399c808d9360-pod-info\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.530033 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkfdp\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-kube-api-access-pkfdp\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.530046 4546 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f9259854-6c00-413e-9061-399c808d9360-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.530054 4546 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.530062 4546 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f9259854-6c00-413e-9061-399c808d9360-server-conf\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.534485 4546 scope.go:117] 
"RemoveContainer" containerID="7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5" Feb 01 07:01:28 crc kubenswrapper[4546]: E0201 07:01:28.538811 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5\": container with ID starting with 7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5 not found: ID does not exist" containerID="7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.538848 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5"} err="failed to get container status \"7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5\": rpc error: code = NotFound desc = could not find container \"7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5\": container with ID starting with 7557f27ac0b0cdcb8b248470a4623c58c8946a2bcdbff33c7bb7a09990f346f5 not found: ID does not exist" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.538905 4546 scope.go:117] "RemoveContainer" containerID="de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9" Feb 01 07:01:28 crc kubenswrapper[4546]: E0201 07:01:28.539321 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9\": container with ID starting with de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9 not found: ID does not exist" containerID="de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.539360 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9"} err="failed to get container status \"de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9\": rpc error: code = NotFound desc = could not find container \"de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9\": container with ID starting with de7b15123b245cfe1f5fc1d9d7ded586969ebe40e1bdf520e965075ba8b657b9 not found: ID does not exist" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.565803 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.566186 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f9259854-6c00-413e-9061-399c808d9360" (UID: "f9259854-6c00-413e-9061-399c808d9360"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.568428 4546 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.633641 4546 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f9259854-6c00-413e-9061-399c808d9360-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.633806 4546 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734241 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-pod-info\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734423 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-plugins\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734462 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-plugins-conf\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734477 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-config-data\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734548 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-erlang-cookie\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734572 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-server-conf\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734595 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-erlang-cookie-secret\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734684 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-tls\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734715 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: 
I0201 07:01:28.734738 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-confd\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.734784 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f4fv\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-kube-api-access-7f4fv\") pod \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\" (UID: \"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973\") " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.737419 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.737929 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.739253 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.742543 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.753969 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-pod-info" (OuterVolumeSpecName: "pod-info") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.761169 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.765121 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.765252 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-kube-api-access-7f4fv" (OuterVolumeSpecName: "kube-api-access-7f4fv") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "kube-api-access-7f4fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.773092 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.791695 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.821775 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-server-conf" (OuterVolumeSpecName: "server-conf") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.824267 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:01:28 crc kubenswrapper[4546]: E0201 07:01:28.824761 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9259854-6c00-413e-9061-399c808d9360" containerName="setup-container" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.824777 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9259854-6c00-413e-9061-399c808d9360" containerName="setup-container" Feb 01 07:01:28 crc kubenswrapper[4546]: E0201 07:01:28.824787 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" containerName="rabbitmq" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.824793 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" containerName="rabbitmq" Feb 01 07:01:28 crc kubenswrapper[4546]: E0201 07:01:28.824832 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" containerName="setup-container" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.824838 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" containerName="setup-container" Feb 01 07:01:28 crc kubenswrapper[4546]: E0201 07:01:28.824851 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9259854-6c00-413e-9061-399c808d9360" containerName="rabbitmq" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.824881 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9259854-6c00-413e-9061-399c808d9360" containerName="rabbitmq" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.825058 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" containerName="rabbitmq" Feb 01 07:01:28 crc 
kubenswrapper[4546]: I0201 07:01:28.825070 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9259854-6c00-413e-9061-399c808d9360" containerName="rabbitmq" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.829714 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.849676 4546 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.849704 4546 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.849717 4546 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.849728 4546 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-server-conf\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.849736 4546 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.849745 4546 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc 
kubenswrapper[4546]: I0201 07:01:28.849765 4546 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.849774 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f4fv\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-kube-api-access-7f4fv\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.849782 4546 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-pod-info\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.854126 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.854363 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.854524 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.854765 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.855245 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5njj2" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.856654 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.856817 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 
07:01:28.880772 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.910518 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-config-data" (OuterVolumeSpecName: "config-data") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.911817 4546 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951286 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951406 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951451 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-config-data\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951484 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951517 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951543 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951560 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tps9c\" (UniqueName: \"kubernetes.io/projected/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-kube-api-access-tps9c\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951586 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951625 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951692 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951708 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951804 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.951821 4546 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:28 crc kubenswrapper[4546]: I0201 07:01:28.975168 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" (UID: "3a322342-7fc8-41ca-9ee3-4e1bbdbf5973"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053193 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053252 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053723 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053817 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053870 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-config-data\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053903 4546 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053927 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053945 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053960 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tps9c\" (UniqueName: \"kubernetes.io/projected/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-kube-api-access-tps9c\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.053979 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.054006 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" 
(UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.054052 4546 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.054326 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.054553 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.054837 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.054847 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.055059 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.056575 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-config-data\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.058900 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.059276 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.059404 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.061373 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 
07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.072324 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tps9c\" (UniqueName: \"kubernetes.io/projected/f431e944-ac18-4e93-8146-3cf7c7ebfa3f-kube-api-access-tps9c\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.091975 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"f431e944-ac18-4e93-8146-3cf7c7ebfa3f\") " pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.175574 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.440209 4546 generic.go:334] "Generic (PLEG): container finished" podID="840e0532-bc87-41b3-8424-3d4128bd9dcf" containerID="d556dc6867e3d693bb940fd4f4dd2665295bfa4ebdecb4df00d4809f871d7548" exitCode=0 Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.440270 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" event={"ID":"840e0532-bc87-41b3-8424-3d4128bd9dcf","Type":"ContainerDied","Data":"d556dc6867e3d693bb940fd4f4dd2665295bfa4ebdecb4df00d4809f871d7548"} Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.447770 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3a322342-7fc8-41ca-9ee3-4e1bbdbf5973","Type":"ContainerDied","Data":"bd652127bb336c49fc87e99870b0a02ea8a4daf1718f26a3068b719eb0804b62"} Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.448110 4546 scope.go:117] "RemoveContainer" containerID="618ce162921f9b2a8ee7ddb9d0c2ca5cb307fca3dff89e03352d2a264ff4e972" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.448034 4546 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.482983 4546 scope.go:117] "RemoveContainer" containerID="9ec81dd258fc5363154282f0f86b3edb322ae34700105e5e89c739bb777690b0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.498384 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.507944 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.523042 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.524958 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.530023 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.530904 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.530952 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.532657 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.533334 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.533463 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 01 
07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.533773 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6xlwf" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.540802 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.595334 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.668720 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a322342-7fc8-41ca-9ee3-4e1bbdbf5973" path="/var/lib/kubelet/pods/3a322342-7fc8-41ca-9ee3-4e1bbdbf5973/volumes" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.668920 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3863a8d0-fb87-4e6f-9432-7832ed43f243-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.668974 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.669025 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3863a8d0-fb87-4e6f-9432-7832ed43f243-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 
07:01:29.669059 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.669089 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3863a8d0-fb87-4e6f-9432-7832ed43f243-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.669115 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3863a8d0-fb87-4e6f-9432-7832ed43f243-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.669136 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bchm6\" (UniqueName: \"kubernetes.io/projected/3863a8d0-fb87-4e6f-9432-7832ed43f243-kube-api-access-bchm6\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.669152 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3863a8d0-fb87-4e6f-9432-7832ed43f243-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.669169 
4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.669399 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.669454 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.670047 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9259854-6c00-413e-9061-399c808d9360" path="/var/lib/kubelet/pods/f9259854-6c00-413e-9061-399c808d9360/volumes" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.771604 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3863a8d0-fb87-4e6f-9432-7832ed43f243-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.771724 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.772283 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.772333 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3863a8d0-fb87-4e6f-9432-7832ed43f243-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.772376 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.772409 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3863a8d0-fb87-4e6f-9432-7832ed43f243-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.772445 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3863a8d0-fb87-4e6f-9432-7832ed43f243-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.772468 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bchm6\" (UniqueName: \"kubernetes.io/projected/3863a8d0-fb87-4e6f-9432-7832ed43f243-kube-api-access-bchm6\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.772488 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3863a8d0-fb87-4e6f-9432-7832ed43f243-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.772507 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.772668 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.773669 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.773757 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.774053 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3863a8d0-fb87-4e6f-9432-7832ed43f243-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.774195 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.774703 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3863a8d0-fb87-4e6f-9432-7832ed43f243-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.776088 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3863a8d0-fb87-4e6f-9432-7832ed43f243-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.776311 4546 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3863a8d0-fb87-4e6f-9432-7832ed43f243-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.782272 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3863a8d0-fb87-4e6f-9432-7832ed43f243-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.782433 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.783707 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3863a8d0-fb87-4e6f-9432-7832ed43f243-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.792708 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bchm6\" (UniqueName: \"kubernetes.io/projected/3863a8d0-fb87-4e6f-9432-7832ed43f243-kube-api-access-bchm6\") pod \"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.840836 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3863a8d0-fb87-4e6f-9432-7832ed43f243\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:29 crc kubenswrapper[4546]: I0201 07:01:29.845728 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:01:30 crc kubenswrapper[4546]: I0201 07:01:30.255341 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:01:30 crc kubenswrapper[4546]: W0201 07:01:30.335107 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3863a8d0_fb87_4e6f_9432_7832ed43f243.slice/crio-babfd587594afe2b6427337c95ce9fe39f0f651e7398687729130aff16ea6fb2 WatchSource:0}: Error finding container babfd587594afe2b6427337c95ce9fe39f0f651e7398687729130aff16ea6fb2: Status 404 returned error can't find the container with id babfd587594afe2b6427337c95ce9fe39f0f651e7398687729130aff16ea6fb2 Feb 01 07:01:30 crc kubenswrapper[4546]: I0201 07:01:30.461275 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" event={"ID":"840e0532-bc87-41b3-8424-3d4128bd9dcf","Type":"ContainerStarted","Data":"8fa60457d50efff95e583abd45633b9512a7bf13f7f25e6159e499fc3ce9fdd9"} Feb 01 07:01:30 crc kubenswrapper[4546]: I0201 07:01:30.461414 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:30 crc kubenswrapper[4546]: I0201 07:01:30.463442 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3863a8d0-fb87-4e6f-9432-7832ed43f243","Type":"ContainerStarted","Data":"babfd587594afe2b6427337c95ce9fe39f0f651e7398687729130aff16ea6fb2"} Feb 01 07:01:30 crc kubenswrapper[4546]: I0201 07:01:30.465334 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"f431e944-ac18-4e93-8146-3cf7c7ebfa3f","Type":"ContainerStarted","Data":"aafc0096eec292db35f928082a82b020a37b46366ac994bb3bce90c1a7bf0fb3"} Feb 01 07:01:30 crc kubenswrapper[4546]: I0201 07:01:30.480083 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" podStartSLOduration=3.480066857 podStartE2EDuration="3.480066857s" podCreationTimestamp="2026-02-01 07:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:01:30.47639824 +0000 UTC m=+1121.127334257" watchObservedRunningTime="2026-02-01 07:01:30.480066857 +0000 UTC m=+1121.131002873" Feb 01 07:01:31 crc kubenswrapper[4546]: I0201 07:01:31.476541 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f431e944-ac18-4e93-8146-3cf7c7ebfa3f","Type":"ContainerStarted","Data":"33a146697873270301cfc673b5961ebbba17e2a983c144fd2478e03a0b5890cd"} Feb 01 07:01:31 crc kubenswrapper[4546]: I0201 07:01:31.478724 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3863a8d0-fb87-4e6f-9432-7832ed43f243","Type":"ContainerStarted","Data":"d6864c4b19e98f0670550676e314f5d98ab7fdd45cdd14d0c27e46f1a89c1ac1"} Feb 01 07:01:37 crc kubenswrapper[4546]: I0201 07:01:37.730028 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:37 crc kubenswrapper[4546]: I0201 07:01:37.773388 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86c594d9d9-q65rt"] Feb 01 07:01:37 crc kubenswrapper[4546]: I0201 07:01:37.773785 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" podUID="f024d767-7d73-468c-a37b-4bab42ab32ba" containerName="dnsmasq-dns" 
containerID="cri-o://6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035" gracePeriod=10 Feb 01 07:01:37 crc kubenswrapper[4546]: I0201 07:01:37.948645 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7965c8b9f5-xg4r5"] Feb 01 07:01:37 crc kubenswrapper[4546]: I0201 07:01:37.950445 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:37 crc kubenswrapper[4546]: I0201 07:01:37.964716 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7965c8b9f5-xg4r5"] Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.037584 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-ovsdbserver-nb\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.037696 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-openstack-edpm-ipam\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.037762 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtl4c\" (UniqueName: \"kubernetes.io/projected/d728cb43-ccb7-4ed7-a301-be192eea5698-kube-api-access-qtl4c\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.037779 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-dns-svc\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.037811 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-ovsdbserver-sb\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.037837 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-dns-swift-storage-0\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.037895 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-config\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.139743 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-dns-swift-storage-0\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.139802 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-config\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.139955 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-ovsdbserver-nb\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.140035 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-openstack-edpm-ipam\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.140142 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtl4c\" (UniqueName: \"kubernetes.io/projected/d728cb43-ccb7-4ed7-a301-be192eea5698-kube-api-access-qtl4c\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.140158 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-dns-svc\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.140202 4546 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-ovsdbserver-sb\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.140685 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-dns-swift-storage-0\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.140922 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-ovsdbserver-sb\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.141281 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-openstack-edpm-ipam\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.141762 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-dns-svc\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.141788 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-config\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.142281 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d728cb43-ccb7-4ed7-a301-be192eea5698-ovsdbserver-nb\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.161559 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtl4c\" (UniqueName: \"kubernetes.io/projected/d728cb43-ccb7-4ed7-a301-be192eea5698-kube-api-access-qtl4c\") pod \"dnsmasq-dns-7965c8b9f5-xg4r5\" (UID: \"d728cb43-ccb7-4ed7-a301-be192eea5698\") " pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.271898 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.294220 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.446193 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-sb\") pod \"f024d767-7d73-468c-a37b-4bab42ab32ba\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.446309 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-nb\") pod \"f024d767-7d73-468c-a37b-4bab42ab32ba\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.446333 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-swift-storage-0\") pod \"f024d767-7d73-468c-a37b-4bab42ab32ba\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.446946 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-config\") pod \"f024d767-7d73-468c-a37b-4bab42ab32ba\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.446998 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cgls\" (UniqueName: \"kubernetes.io/projected/f024d767-7d73-468c-a37b-4bab42ab32ba-kube-api-access-8cgls\") pod \"f024d767-7d73-468c-a37b-4bab42ab32ba\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.447070 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-svc\") pod \"f024d767-7d73-468c-a37b-4bab42ab32ba\" (UID: \"f024d767-7d73-468c-a37b-4bab42ab32ba\") " Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.456217 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f024d767-7d73-468c-a37b-4bab42ab32ba-kube-api-access-8cgls" (OuterVolumeSpecName: "kube-api-access-8cgls") pod "f024d767-7d73-468c-a37b-4bab42ab32ba" (UID: "f024d767-7d73-468c-a37b-4bab42ab32ba"). InnerVolumeSpecName "kube-api-access-8cgls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.495607 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f024d767-7d73-468c-a37b-4bab42ab32ba" (UID: "f024d767-7d73-468c-a37b-4bab42ab32ba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.499643 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-config" (OuterVolumeSpecName: "config") pod "f024d767-7d73-468c-a37b-4bab42ab32ba" (UID: "f024d767-7d73-468c-a37b-4bab42ab32ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.521061 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f024d767-7d73-468c-a37b-4bab42ab32ba" (UID: "f024d767-7d73-468c-a37b-4bab42ab32ba"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.528404 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f024d767-7d73-468c-a37b-4bab42ab32ba" (UID: "f024d767-7d73-468c-a37b-4bab42ab32ba"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.541459 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f024d767-7d73-468c-a37b-4bab42ab32ba" (UID: "f024d767-7d73-468c-a37b-4bab42ab32ba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.547079 4546 generic.go:334] "Generic (PLEG): container finished" podID="f024d767-7d73-468c-a37b-4bab42ab32ba" containerID="6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035" exitCode=0 Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.547115 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" event={"ID":"f024d767-7d73-468c-a37b-4bab42ab32ba","Type":"ContainerDied","Data":"6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035"} Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.547145 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" event={"ID":"f024d767-7d73-468c-a37b-4bab42ab32ba","Type":"ContainerDied","Data":"320c1f53b6215d287144db7c4daa03a5477edecbd8f55f4aff9395284acf97db"} Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.547165 4546 scope.go:117] "RemoveContainer" 
containerID="6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.547301 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.551792 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.552382 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.552407 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.552418 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cgls\" (UniqueName: \"kubernetes.io/projected/f024d767-7d73-468c-a37b-4bab42ab32ba-kube-api-access-8cgls\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.552428 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.552438 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f024d767-7d73-468c-a37b-4bab42ab32ba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.591200 4546 scope.go:117] "RemoveContainer" 
containerID="9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.598504 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86c594d9d9-q65rt"] Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.619022 4546 scope.go:117] "RemoveContainer" containerID="6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.619690 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86c594d9d9-q65rt"] Feb 01 07:01:38 crc kubenswrapper[4546]: E0201 07:01:38.620090 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035\": container with ID starting with 6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035 not found: ID does not exist" containerID="6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.620128 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035"} err="failed to get container status \"6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035\": rpc error: code = NotFound desc = could not find container \"6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035\": container with ID starting with 6217d81fc21f3b1338748dc078ddfcac2c51ae54ce09b11f22a93877c1641035 not found: ID does not exist" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.620157 4546 scope.go:117] "RemoveContainer" containerID="9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed" Feb 01 07:01:38 crc kubenswrapper[4546]: E0201 07:01:38.620423 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed\": container with ID starting with 9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed not found: ID does not exist" containerID="9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.620460 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed"} err="failed to get container status \"9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed\": rpc error: code = NotFound desc = could not find container \"9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed\": container with ID starting with 9ee218e3e1f1acbb72d3f56a91dbc36290b5cb86cf0372454ac9f5db30a6b4ed not found: ID does not exist" Feb 01 07:01:38 crc kubenswrapper[4546]: I0201 07:01:38.774600 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7965c8b9f5-xg4r5"] Feb 01 07:01:39 crc kubenswrapper[4546]: I0201 07:01:39.575640 4546 generic.go:334] "Generic (PLEG): container finished" podID="d728cb43-ccb7-4ed7-a301-be192eea5698" containerID="a69ee285dc5363a87bed566cc85e730b2023d79d21f806c427f7d781194ff1e3" exitCode=0 Feb 01 07:01:39 crc kubenswrapper[4546]: I0201 07:01:39.575991 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" event={"ID":"d728cb43-ccb7-4ed7-a301-be192eea5698","Type":"ContainerDied","Data":"a69ee285dc5363a87bed566cc85e730b2023d79d21f806c427f7d781194ff1e3"} Feb 01 07:01:39 crc kubenswrapper[4546]: I0201 07:01:39.576031 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" event={"ID":"d728cb43-ccb7-4ed7-a301-be192eea5698","Type":"ContainerStarted","Data":"c08a053952c157df39fe11f8d81a084296ac72a6c4049f5c16b0b024a7a4363f"} Feb 01 07:01:39 crc kubenswrapper[4546]: I0201 07:01:39.668252 4546 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f024d767-7d73-468c-a37b-4bab42ab32ba" path="/var/lib/kubelet/pods/f024d767-7d73-468c-a37b-4bab42ab32ba/volumes" Feb 01 07:01:40 crc kubenswrapper[4546]: I0201 07:01:40.585672 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" event={"ID":"d728cb43-ccb7-4ed7-a301-be192eea5698","Type":"ContainerStarted","Data":"1f2fae4740200073f187697c311c5e50c827fce5f4b99d7c48d9f00d2b1291e6"} Feb 01 07:01:40 crc kubenswrapper[4546]: I0201 07:01:40.586181 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:40 crc kubenswrapper[4546]: I0201 07:01:40.610534 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" podStartSLOduration=3.610519724 podStartE2EDuration="3.610519724s" podCreationTimestamp="2026-02-01 07:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:01:40.600684222 +0000 UTC m=+1131.251620239" watchObservedRunningTime="2026-02-01 07:01:40.610519724 +0000 UTC m=+1131.261455740" Feb 01 07:01:43 crc kubenswrapper[4546]: I0201 07:01:43.190078 4546 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86c594d9d9-q65rt" podUID="f024d767-7d73-468c-a37b-4bab42ab32ba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.216:5353: i/o timeout" Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.275081 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7965c8b9f5-xg4r5" Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.320982 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7458997b65-dpmh6"] Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.321234 4546 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" podUID="840e0532-bc87-41b3-8424-3d4128bd9dcf" containerName="dnsmasq-dns" containerID="cri-o://8fa60457d50efff95e583abd45633b9512a7bf13f7f25e6159e499fc3ce9fdd9" gracePeriod=10 Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.657134 4546 generic.go:334] "Generic (PLEG): container finished" podID="840e0532-bc87-41b3-8424-3d4128bd9dcf" containerID="8fa60457d50efff95e583abd45633b9512a7bf13f7f25e6159e499fc3ce9fdd9" exitCode=0 Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.657199 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" event={"ID":"840e0532-bc87-41b3-8424-3d4128bd9dcf","Type":"ContainerDied","Data":"8fa60457d50efff95e583abd45633b9512a7bf13f7f25e6159e499fc3ce9fdd9"} Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.809386 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.958772 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-swift-storage-0\") pod \"840e0532-bc87-41b3-8424-3d4128bd9dcf\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.959233 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-openstack-edpm-ipam\") pod \"840e0532-bc87-41b3-8424-3d4128bd9dcf\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.959293 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsw6c\" (UniqueName: 
\"kubernetes.io/projected/840e0532-bc87-41b3-8424-3d4128bd9dcf-kube-api-access-dsw6c\") pod \"840e0532-bc87-41b3-8424-3d4128bd9dcf\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.959337 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-config\") pod \"840e0532-bc87-41b3-8424-3d4128bd9dcf\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.959437 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-svc\") pod \"840e0532-bc87-41b3-8424-3d4128bd9dcf\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.959510 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-nb\") pod \"840e0532-bc87-41b3-8424-3d4128bd9dcf\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " Feb 01 07:01:48 crc kubenswrapper[4546]: I0201 07:01:48.959533 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-sb\") pod \"840e0532-bc87-41b3-8424-3d4128bd9dcf\" (UID: \"840e0532-bc87-41b3-8424-3d4128bd9dcf\") " Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.010249 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840e0532-bc87-41b3-8424-3d4128bd9dcf-kube-api-access-dsw6c" (OuterVolumeSpecName: "kube-api-access-dsw6c") pod "840e0532-bc87-41b3-8424-3d4128bd9dcf" (UID: "840e0532-bc87-41b3-8424-3d4128bd9dcf"). InnerVolumeSpecName "kube-api-access-dsw6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.062666 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsw6c\" (UniqueName: \"kubernetes.io/projected/840e0532-bc87-41b3-8424-3d4128bd9dcf-kube-api-access-dsw6c\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.070884 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "840e0532-bc87-41b3-8424-3d4128bd9dcf" (UID: "840e0532-bc87-41b3-8424-3d4128bd9dcf"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.081746 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "840e0532-bc87-41b3-8424-3d4128bd9dcf" (UID: "840e0532-bc87-41b3-8424-3d4128bd9dcf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.122871 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "840e0532-bc87-41b3-8424-3d4128bd9dcf" (UID: "840e0532-bc87-41b3-8424-3d4128bd9dcf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.127775 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "840e0532-bc87-41b3-8424-3d4128bd9dcf" (UID: "840e0532-bc87-41b3-8424-3d4128bd9dcf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.129063 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-config" (OuterVolumeSpecName: "config") pod "840e0532-bc87-41b3-8424-3d4128bd9dcf" (UID: "840e0532-bc87-41b3-8424-3d4128bd9dcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.166050 4546 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.166085 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.166100 4546 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.166111 4546 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-openstack-edpm-ipam\") on node \"crc\" 
DevicePath \"\"" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.166120 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.170882 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "840e0532-bc87-41b3-8424-3d4128bd9dcf" (UID: "840e0532-bc87-41b3-8424-3d4128bd9dcf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.267687 4546 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/840e0532-bc87-41b3-8424-3d4128bd9dcf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.702769 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.707740 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7458997b65-dpmh6" event={"ID":"840e0532-bc87-41b3-8424-3d4128bd9dcf","Type":"ContainerDied","Data":"e4e70fba14ca9f0189a18630708ac4a656d2172a1be373125864dc09cb187d5b"} Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.707799 4546 scope.go:117] "RemoveContainer" containerID="8fa60457d50efff95e583abd45633b9512a7bf13f7f25e6159e499fc3ce9fdd9" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.734693 4546 scope.go:117] "RemoveContainer" containerID="d556dc6867e3d693bb940fd4f4dd2665295bfa4ebdecb4df00d4809f871d7548" Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.764119 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7458997b65-dpmh6"] Feb 01 07:01:49 crc kubenswrapper[4546]: I0201 07:01:49.784566 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7458997b65-dpmh6"] Feb 01 07:01:51 crc kubenswrapper[4546]: I0201 07:01:51.664335 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840e0532-bc87-41b3-8424-3d4128bd9dcf" path="/var/lib/kubelet/pods/840e0532-bc87-41b3-8424-3d4128bd9dcf/volumes" Feb 01 07:01:55 crc kubenswrapper[4546]: I0201 07:01:55.420349 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:01:55 crc kubenswrapper[4546]: I0201 07:01:55.421087 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:01:55 crc kubenswrapper[4546]: I0201 07:01:55.421138 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:01:55 crc kubenswrapper[4546]: I0201 07:01:55.422056 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff755a5e7d12266478d722de4d8aa4b38f438587098c0c86e0cd4cb579735ed6"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:01:55 crc kubenswrapper[4546]: I0201 07:01:55.422122 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://ff755a5e7d12266478d722de4d8aa4b38f438587098c0c86e0cd4cb579735ed6" gracePeriod=600 Feb 01 07:01:55 crc kubenswrapper[4546]: I0201 07:01:55.774319 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="ff755a5e7d12266478d722de4d8aa4b38f438587098c0c86e0cd4cb579735ed6" exitCode=0 Feb 01 07:01:55 crc kubenswrapper[4546]: I0201 07:01:55.774474 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"ff755a5e7d12266478d722de4d8aa4b38f438587098c0c86e0cd4cb579735ed6"} Feb 01 07:01:55 crc kubenswrapper[4546]: I0201 07:01:55.774638 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"1d4ad86c403500757fcc4279e352c025d98a79c4116ab07f1b0bdf4a335f7d1e"} Feb 01 07:01:55 crc kubenswrapper[4546]: I0201 07:01:55.774664 4546 scope.go:117] "RemoveContainer" containerID="bf1cd428222258ce8831b2b35aceea3cc1215cfdc89e91fc366faeefbc43f53d" Feb 01 07:02:02 crc kubenswrapper[4546]: I0201 07:02:02.855044 4546 generic.go:334] "Generic (PLEG): container finished" podID="f431e944-ac18-4e93-8146-3cf7c7ebfa3f" containerID="33a146697873270301cfc673b5961ebbba17e2a983c144fd2478e03a0b5890cd" exitCode=0 Feb 01 07:02:02 crc kubenswrapper[4546]: I0201 07:02:02.855128 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f431e944-ac18-4e93-8146-3cf7c7ebfa3f","Type":"ContainerDied","Data":"33a146697873270301cfc673b5961ebbba17e2a983c144fd2478e03a0b5890cd"} Feb 01 07:02:03 crc kubenswrapper[4546]: I0201 07:02:03.866426 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f431e944-ac18-4e93-8146-3cf7c7ebfa3f","Type":"ContainerStarted","Data":"b502cdbbb00629185ca058dcf45001a246495d00105186b6c0601fd44fc65cc7"} Feb 01 07:02:03 crc kubenswrapper[4546]: I0201 07:02:03.867879 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 01 07:02:03 crc kubenswrapper[4546]: I0201 07:02:03.869500 4546 generic.go:334] "Generic (PLEG): container finished" podID="3863a8d0-fb87-4e6f-9432-7832ed43f243" containerID="d6864c4b19e98f0670550676e314f5d98ab7fdd45cdd14d0c27e46f1a89c1ac1" exitCode=0 Feb 01 07:02:03 crc kubenswrapper[4546]: I0201 07:02:03.869550 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3863a8d0-fb87-4e6f-9432-7832ed43f243","Type":"ContainerDied","Data":"d6864c4b19e98f0670550676e314f5d98ab7fdd45cdd14d0c27e46f1a89c1ac1"} Feb 01 07:02:03 crc kubenswrapper[4546]: I0201 07:02:03.895275 4546 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.89525886 podStartE2EDuration="35.89525886s" podCreationTimestamp="2026-02-01 07:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:02:03.895003609 +0000 UTC m=+1154.545939625" watchObservedRunningTime="2026-02-01 07:02:03.89525886 +0000 UTC m=+1154.546194876" Feb 01 07:02:04 crc kubenswrapper[4546]: I0201 07:02:04.882688 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3863a8d0-fb87-4e6f-9432-7832ed43f243","Type":"ContainerStarted","Data":"4835782f406ba1745f6d5ad480cb09eeb65ba2469f3f99cebe02f0189744b70d"} Feb 01 07:02:04 crc kubenswrapper[4546]: I0201 07:02:04.883729 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:02:04 crc kubenswrapper[4546]: I0201 07:02:04.908175 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.90815021 podStartE2EDuration="35.90815021s" podCreationTimestamp="2026-02-01 07:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:02:04.901470938 +0000 UTC m=+1155.552406954" watchObservedRunningTime="2026-02-01 07:02:04.90815021 +0000 UTC m=+1155.559086226" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.510013 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"] Feb 01 07:02:06 crc kubenswrapper[4546]: E0201 07:02:06.510742 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f024d767-7d73-468c-a37b-4bab42ab32ba" containerName="init" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.510759 4546 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f024d767-7d73-468c-a37b-4bab42ab32ba" containerName="init" Feb 01 07:02:06 crc kubenswrapper[4546]: E0201 07:02:06.510780 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840e0532-bc87-41b3-8424-3d4128bd9dcf" containerName="dnsmasq-dns" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.510785 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="840e0532-bc87-41b3-8424-3d4128bd9dcf" containerName="dnsmasq-dns" Feb 01 07:02:06 crc kubenswrapper[4546]: E0201 07:02:06.510798 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840e0532-bc87-41b3-8424-3d4128bd9dcf" containerName="init" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.510804 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="840e0532-bc87-41b3-8424-3d4128bd9dcf" containerName="init" Feb 01 07:02:06 crc kubenswrapper[4546]: E0201 07:02:06.510836 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f024d767-7d73-468c-a37b-4bab42ab32ba" containerName="dnsmasq-dns" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.510842 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f024d767-7d73-468c-a37b-4bab42ab32ba" containerName="dnsmasq-dns" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.511121 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f024d767-7d73-468c-a37b-4bab42ab32ba" containerName="dnsmasq-dns" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.511156 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="840e0532-bc87-41b3-8424-3d4128bd9dcf" containerName="dnsmasq-dns" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.511909 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc kubenswrapper[4546]: W0201 07:02:06.514818 4546 reflector.go:561] object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm": failed to list *v1.Secret: secrets "openstack-edpm-ipam-dockercfg-pctfm" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 01 07:02:06 crc kubenswrapper[4546]: E0201 07:02:06.514897 4546 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-edpm-ipam-dockercfg-pctfm\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-edpm-ipam-dockercfg-pctfm\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 07:02:06 crc kubenswrapper[4546]: W0201 07:02:06.515370 4546 reflector.go:561] object-"openstack"/"dataplane-ansible-ssh-private-key-secret": failed to list *v1.Secret: secrets "dataplane-ansible-ssh-private-key-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 01 07:02:06 crc kubenswrapper[4546]: E0201 07:02:06.515394 4546 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dataplane-ansible-ssh-private-key-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"dataplane-ansible-ssh-private-key-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 07:02:06 crc kubenswrapper[4546]: W0201 07:02:06.515429 4546 reflector.go:561] object-"openstack"/"dataplanenodeset-openstack-edpm-ipam": failed 
to list *v1.Secret: secrets "dataplanenodeset-openstack-edpm-ipam" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 01 07:02:06 crc kubenswrapper[4546]: W0201 07:02:06.515436 4546 reflector.go:561] object-"openstack"/"openstack-aee-default-env": failed to list *v1.ConfigMap: configmaps "openstack-aee-default-env" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Feb 01 07:02:06 crc kubenswrapper[4546]: E0201 07:02:06.515465 4546 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"dataplanenodeset-openstack-edpm-ipam\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"dataplanenodeset-openstack-edpm-ipam\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 07:02:06 crc kubenswrapper[4546]: E0201 07:02:06.515487 4546 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-aee-default-env\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openstack-aee-default-env\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.529209 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"] Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.593628 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.593693 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.593761 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.593936 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smz4r\" (UniqueName: \"kubernetes.io/projected/a1186a63-073a-4853-8f33-37c1a9bfd55d-kube-api-access-smz4r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.696489 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.696571 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.696624 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.696676 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smz4r\" (UniqueName: \"kubernetes.io/projected/a1186a63-073a-4853-8f33-37c1a9bfd55d-kube-api-access-smz4r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc kubenswrapper[4546]: I0201 07:02:06.712407 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" Feb 01 07:02:06 crc 
kubenswrapper[4546]: I0201 07:02:06.723484 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smz4r\" (UniqueName: \"kubernetes.io/projected/a1186a63-073a-4853-8f33-37c1a9bfd55d-kube-api-access-smz4r\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"
Feb 01 07:02:07 crc kubenswrapper[4546]: I0201 07:02:07.396247 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm"
Feb 01 07:02:07 crc kubenswrapper[4546]: I0201 07:02:07.658764 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 01 07:02:07 crc kubenswrapper[4546]: I0201 07:02:07.670096 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"
Feb 01 07:02:07 crc kubenswrapper[4546]: E0201 07:02:07.697658 4546 secret.go:188] Couldn't get secret openstack/dataplanenodeset-openstack-edpm-ipam: failed to sync secret cache: timed out waiting for the condition
Feb 01 07:02:07 crc kubenswrapper[4546]: E0201 07:02:07.697811 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-inventory podName:a1186a63-073a-4853-8f33-37c1a9bfd55d nodeName:}" failed. No retries permitted until 2026-02-01 07:02:08.197793657 +0000 UTC m=+1158.848729673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "inventory" (UniqueName: "kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-inventory") pod "repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" (UID: "a1186a63-073a-4853-8f33-37c1a9bfd55d") : failed to sync secret cache: timed out waiting for the condition
Feb 01 07:02:08 crc kubenswrapper[4546]: I0201 07:02:08.017014 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 01 07:02:08 crc kubenswrapper[4546]: I0201 07:02:08.034762 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 01 07:02:08 crc kubenswrapper[4546]: I0201 07:02:08.223527 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"
Feb 01 07:02:08 crc kubenswrapper[4546]: I0201 07:02:08.228746 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"
Feb 01 07:02:08 crc kubenswrapper[4546]: I0201 07:02:08.329403 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"
Feb 01 07:02:08 crc kubenswrapper[4546]: I0201 07:02:08.890485 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"]
Feb 01 07:02:08 crc kubenswrapper[4546]: I0201 07:02:08.930489 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" event={"ID":"a1186a63-073a-4853-8f33-37c1a9bfd55d","Type":"ContainerStarted","Data":"11e17b1939669e5b6c15e33bc66fc2133ad26b8219a0f35828bd498ecd81d7ae"}
Feb 01 07:02:19 crc kubenswrapper[4546]: I0201 07:02:19.178023 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 01 07:02:19 crc kubenswrapper[4546]: I0201 07:02:19.851029 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 01 07:02:20 crc kubenswrapper[4546]: I0201 07:02:20.050123 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" event={"ID":"a1186a63-073a-4853-8f33-37c1a9bfd55d","Type":"ContainerStarted","Data":"084af8cc4885c9b8f98fef69a2cfa056df9606b65c98a548a51b4c1dd12b7f9a"}
Feb 01 07:02:20 crc kubenswrapper[4546]: I0201 07:02:20.075822 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" podStartSLOduration=3.776742114 podStartE2EDuration="14.075806147s" podCreationTimestamp="2026-02-01 07:02:06 +0000 UTC" firstStartedPulling="2026-02-01 07:02:08.912926532 +0000 UTC m=+1159.563862537" lastFinishedPulling="2026-02-01 07:02:19.211990554 +0000 UTC m=+1169.862926570" observedRunningTime="2026-02-01 07:02:20.067437279 +0000 UTC m=+1170.718373296" watchObservedRunningTime="2026-02-01 07:02:20.075806147 +0000 UTC m=+1170.726742162"
Feb 01 07:02:32 crc kubenswrapper[4546]: I0201 07:02:32.180018 4546 generic.go:334] "Generic (PLEG): container finished" podID="a1186a63-073a-4853-8f33-37c1a9bfd55d" containerID="084af8cc4885c9b8f98fef69a2cfa056df9606b65c98a548a51b4c1dd12b7f9a" exitCode=0
Feb 01 07:02:32 crc kubenswrapper[4546]: I0201 07:02:32.180114 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" event={"ID":"a1186a63-073a-4853-8f33-37c1a9bfd55d","Type":"ContainerDied","Data":"084af8cc4885c9b8f98fef69a2cfa056df9606b65c98a548a51b4c1dd12b7f9a"}
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.627827 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.679510 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smz4r\" (UniqueName: \"kubernetes.io/projected/a1186a63-073a-4853-8f33-37c1a9bfd55d-kube-api-access-smz4r\") pod \"a1186a63-073a-4853-8f33-37c1a9bfd55d\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") "
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.679568 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-ssh-key-openstack-edpm-ipam\") pod \"a1186a63-073a-4853-8f33-37c1a9bfd55d\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") "
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.694785 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1186a63-073a-4853-8f33-37c1a9bfd55d-kube-api-access-smz4r" (OuterVolumeSpecName: "kube-api-access-smz4r") pod "a1186a63-073a-4853-8f33-37c1a9bfd55d" (UID: "a1186a63-073a-4853-8f33-37c1a9bfd55d"). InnerVolumeSpecName "kube-api-access-smz4r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.711170 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a1186a63-073a-4853-8f33-37c1a9bfd55d" (UID: "a1186a63-073a-4853-8f33-37c1a9bfd55d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.781179 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-repo-setup-combined-ca-bundle\") pod \"a1186a63-073a-4853-8f33-37c1a9bfd55d\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") "
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.781219 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-inventory\") pod \"a1186a63-073a-4853-8f33-37c1a9bfd55d\" (UID: \"a1186a63-073a-4853-8f33-37c1a9bfd55d\") "
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.781774 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smz4r\" (UniqueName: \"kubernetes.io/projected/a1186a63-073a-4853-8f33-37c1a9bfd55d-kube-api-access-smz4r\") on node \"crc\" DevicePath \"\""
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.781795 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.784494 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a1186a63-073a-4853-8f33-37c1a9bfd55d" (UID: "a1186a63-073a-4853-8f33-37c1a9bfd55d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.801527 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-inventory" (OuterVolumeSpecName: "inventory") pod "a1186a63-073a-4853-8f33-37c1a9bfd55d" (UID: "a1186a63-073a-4853-8f33-37c1a9bfd55d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.883390 4546 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:02:33 crc kubenswrapper[4546]: I0201 07:02:33.883416 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1186a63-073a-4853-8f33-37c1a9bfd55d-inventory\") on node \"crc\" DevicePath \"\""
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.204738 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt" event={"ID":"a1186a63-073a-4853-8f33-37c1a9bfd55d","Type":"ContainerDied","Data":"11e17b1939669e5b6c15e33bc66fc2133ad26b8219a0f35828bd498ecd81d7ae"}
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.205012 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11e17b1939669e5b6c15e33bc66fc2133ad26b8219a0f35828bd498ecd81d7ae"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.204805 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-k48lt"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.277196 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"]
Feb 01 07:02:34 crc kubenswrapper[4546]: E0201 07:02:34.277664 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1186a63-073a-4853-8f33-37c1a9bfd55d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.277684 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1186a63-073a-4853-8f33-37c1a9bfd55d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.277900 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1186a63-073a-4853-8f33-37c1a9bfd55d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.278603 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.283331 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.283507 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.283714 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.284754 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.298914 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"]
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.393894 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7dc8h\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.394288 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7dc8h\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.394457 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k88zc\" (UniqueName: \"kubernetes.io/projected/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-kube-api-access-k88zc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7dc8h\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.496933 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k88zc\" (UniqueName: \"kubernetes.io/projected/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-kube-api-access-k88zc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7dc8h\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.497049 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7dc8h\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.497383 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7dc8h\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.502041 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7dc8h\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.502447 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7dc8h\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.512657 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k88zc\" (UniqueName: \"kubernetes.io/projected/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-kube-api-access-k88zc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7dc8h\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:34 crc kubenswrapper[4546]: I0201 07:02:34.602312 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:35 crc kubenswrapper[4546]: I0201 07:02:35.116872 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"]
Feb 01 07:02:35 crc kubenswrapper[4546]: I0201 07:02:35.217030 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h" event={"ID":"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7","Type":"ContainerStarted","Data":"53d0d6e7a533f3cb11aae1f92e125196b1d3dacc9c6cf876214d12942977e309"}
Feb 01 07:02:36 crc kubenswrapper[4546]: I0201 07:02:36.227622 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h" event={"ID":"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7","Type":"ContainerStarted","Data":"34d3ce728cffff10c849ca26662066a8c7b5932a6b26e5943823eafcaba7f3b3"}
Feb 01 07:02:36 crc kubenswrapper[4546]: I0201 07:02:36.246115 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h" podStartSLOduration=1.790390864 podStartE2EDuration="2.246097431s" podCreationTimestamp="2026-02-01 07:02:34 +0000 UTC" firstStartedPulling="2026-02-01 07:02:35.120876489 +0000 UTC m=+1185.771812505" lastFinishedPulling="2026-02-01 07:02:35.576583056 +0000 UTC m=+1186.227519072" observedRunningTime="2026-02-01 07:02:36.239791935 +0000 UTC m=+1186.890727951" watchObservedRunningTime="2026-02-01 07:02:36.246097431 +0000 UTC m=+1186.897033448"
Feb 01 07:02:38 crc kubenswrapper[4546]: I0201 07:02:38.251490 4546 generic.go:334] "Generic (PLEG): container finished" podID="9253d7fa-4c3c-42a8-9fb6-77de712b9ac7" containerID="34d3ce728cffff10c849ca26662066a8c7b5932a6b26e5943823eafcaba7f3b3" exitCode=0
Feb 01 07:02:38 crc kubenswrapper[4546]: I0201 07:02:38.251787 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h" event={"ID":"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7","Type":"ContainerDied","Data":"34d3ce728cffff10c849ca26662066a8c7b5932a6b26e5943823eafcaba7f3b3"}
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.637838 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.817031 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-ssh-key-openstack-edpm-ipam\") pod \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") "
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.817191 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-inventory\") pod \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") "
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.817403 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k88zc\" (UniqueName: \"kubernetes.io/projected/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-kube-api-access-k88zc\") pod \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\" (UID: \"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7\") "
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.827592 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-kube-api-access-k88zc" (OuterVolumeSpecName: "kube-api-access-k88zc") pod "9253d7fa-4c3c-42a8-9fb6-77de712b9ac7" (UID: "9253d7fa-4c3c-42a8-9fb6-77de712b9ac7"). InnerVolumeSpecName "kube-api-access-k88zc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.845299 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-inventory" (OuterVolumeSpecName: "inventory") pod "9253d7fa-4c3c-42a8-9fb6-77de712b9ac7" (UID: "9253d7fa-4c3c-42a8-9fb6-77de712b9ac7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.845768 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9253d7fa-4c3c-42a8-9fb6-77de712b9ac7" (UID: "9253d7fa-4c3c-42a8-9fb6-77de712b9ac7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.920913 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k88zc\" (UniqueName: \"kubernetes.io/projected/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-kube-api-access-k88zc\") on node \"crc\" DevicePath \"\""
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.921193 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 01 07:02:39 crc kubenswrapper[4546]: I0201 07:02:39.921205 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9253d7fa-4c3c-42a8-9fb6-77de712b9ac7-inventory\") on node \"crc\" DevicePath \"\""
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.278993 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.278848 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7dc8h" event={"ID":"9253d7fa-4c3c-42a8-9fb6-77de712b9ac7","Type":"ContainerDied","Data":"53d0d6e7a533f3cb11aae1f92e125196b1d3dacc9c6cf876214d12942977e309"}
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.279516 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53d0d6e7a533f3cb11aae1f92e125196b1d3dacc9c6cf876214d12942977e309"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.370990 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"]
Feb 01 07:02:40 crc kubenswrapper[4546]: E0201 07:02:40.371675 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9253d7fa-4c3c-42a8-9fb6-77de712b9ac7" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.371712 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="9253d7fa-4c3c-42a8-9fb6-77de712b9ac7" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.372037 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="9253d7fa-4c3c-42a8-9fb6-77de712b9ac7" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.373328 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.377959 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.378238 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.378412 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.378441 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.384637 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"]
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.535444 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.535536 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkp4c\" (UniqueName: \"kubernetes.io/projected/62eaca86-490a-4079-b79c-700c6b51135c-kube-api-access-rkp4c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.535586 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.535629 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.638205 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkp4c\" (UniqueName: \"kubernetes.io/projected/62eaca86-490a-4079-b79c-700c6b51135c-kube-api-access-rkp4c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.638273 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.638330 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.638417 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.644847 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.645250 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.652435 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.656008 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkp4c\" (UniqueName: \"kubernetes.io/projected/62eaca86-490a-4079-b79c-700c6b51135c-kube-api-access-rkp4c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:40 crc kubenswrapper[4546]: I0201 07:02:40.689047 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"
Feb 01 07:02:41 crc kubenswrapper[4546]: I0201 07:02:41.223603 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8"]
Feb 01 07:02:41 crc kubenswrapper[4546]: I0201 07:02:41.288062 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8" event={"ID":"62eaca86-490a-4079-b79c-700c6b51135c","Type":"ContainerStarted","Data":"9d68ee4faf367f4eef46848cbbe223205440aad1ec3606297dd1446eb4dedbfe"}
Feb 01 07:02:42 crc kubenswrapper[4546]: I0201 07:02:42.301944 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8" event={"ID":"62eaca86-490a-4079-b79c-700c6b51135c","Type":"ContainerStarted","Data":"4e1508d9e7960cf3709a5e9813be9a6f592d3ef76eb8a3d5055149c3a5bea8b5"}
Feb 01 07:02:42 crc kubenswrapper[4546]: I0201 07:02:42.319335 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8" podStartSLOduration=1.8050743900000001 podStartE2EDuration="2.319318832s" podCreationTimestamp="2026-02-01 07:02:40 +0000 UTC" firstStartedPulling="2026-02-01 07:02:41.239557398 +0000 UTC m=+1191.890493414" lastFinishedPulling="2026-02-01 07:02:41.75380184 +0000 UTC m=+1192.404737856" observedRunningTime="2026-02-01 07:02:42.316789303 +0000 UTC m=+1192.967725318" watchObservedRunningTime="2026-02-01 07:02:42.319318832 +0000 UTC m=+1192.970254847"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.364489 4546 scope.go:117] "RemoveContainer" containerID="883e7ee92dc9c006c2ee3b5023b11b62bca942ddbb338cc21935a5813481af6a"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.395622 4546 scope.go:117] "RemoveContainer" containerID="7d53d77a49298035e2d463605f6879ba3935311b8c9719d2e624c97d587d6bad"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.420902 4546 scope.go:117] "RemoveContainer" containerID="dee064dc14987663f756eb4931182ab2deefc780c8cfee58b0b3ca9c9b214e18"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.457023 4546 scope.go:117] "RemoveContainer" containerID="e9354b7a901cca0c83049b71242cddc2500c450838deed30777b03f96b810771"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.484192 4546 scope.go:117] "RemoveContainer" containerID="1d488405bad77d40bd293106247f4d348d0b9cec9baa599ea3c7b30c45f30da1"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.518312 4546 scope.go:117] "RemoveContainer" containerID="5b13d6e367fce99405d6116e016d5e5e8001d97ba89a91a14de572c1e7865c95"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.559982 4546 scope.go:117] "RemoveContainer" containerID="c1c094bd713953a8c2476ed5f01fd66f263e757dbc3cd9d624f3a1677a9fc68e"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.582168 4546 scope.go:117] "RemoveContainer" containerID="8d7a360baf46332323a6ee69d6a6a60b8132bd387a7109471db6ea3cc3774242"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.601508 4546 scope.go:117] "RemoveContainer" containerID="4af2fb52b3c3c8f1e22ecec2409f13938bb3ed88af61e88c2cf0e3b79d7c102c"
Feb 01 07:02:54 crc kubenswrapper[4546]: I0201 07:02:54.640734 4546 scope.go:117] "RemoveContainer" containerID="91b9bce1ce39d7be6584b1e8919e1537f07fc027dc46be0fd6b79ceb47b68739"
Feb 01 07:03:54 crc kubenswrapper[4546]: I0201 07:03:54.798154 4546 scope.go:117] "RemoveContainer" containerID="d5da42df714ce3e5a28a35bcb3189a9b69fb2324994c13a75e54bb8948ec49ca"
Feb 01 07:03:54 crc kubenswrapper[4546]: I0201 07:03:54.829172 4546 scope.go:117] "RemoveContainer" containerID="123f4bf90781e8a1b921df34730c8727ba577ffffeb26b87a686fa6a3fd0d2f9"
Feb 01 07:03:55 crc kubenswrapper[4546]: I0201 07:03:55.420514 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 07:03:55 crc kubenswrapper[4546]: I0201 07:03:55.420577 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 07:04:25 crc kubenswrapper[4546]: I0201 07:04:25.421110 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 07:04:25 crc kubenswrapper[4546]: I0201 07:04:25.421919 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 07:04:55 crc kubenswrapper[4546]: I0201 07:04:55.420780 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 07:04:55 crc kubenswrapper[4546]: I0201 07:04:55.421432 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 07:04:55 crc kubenswrapper[4546]: I0201 07:04:55.421478 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx"
Feb 01 07:04:55 crc kubenswrapper[4546]: I0201 07:04:55.422353 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d4ad86c403500757fcc4279e352c025d98a79c4116ab07f1b0bdf4a335f7d1e"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 01 07:04:55 crc kubenswrapper[4546]: I0201 07:04:55.422407 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://1d4ad86c403500757fcc4279e352c025d98a79c4116ab07f1b0bdf4a335f7d1e" gracePeriod=600
Feb 01 07:04:55 crc kubenswrapper[4546]: I0201 07:04:55.612514 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="1d4ad86c403500757fcc4279e352c025d98a79c4116ab07f1b0bdf4a335f7d1e" exitCode=0
Feb 01 07:04:55 crc kubenswrapper[4546]: I0201 07:04:55.612562 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"1d4ad86c403500757fcc4279e352c025d98a79c4116ab07f1b0bdf4a335f7d1e"}
Feb 01 07:04:55 crc kubenswrapper[4546]: I0201 07:04:55.612652 4546 scope.go:117] "RemoveContainer" containerID="ff755a5e7d12266478d722de4d8aa4b38f438587098c0c86e0cd4cb579735ed6"
Feb 01 07:04:56 crc kubenswrapper[4546]: I0201 07:04:56.622457 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f"}
Feb 01 07:05:41 crc kubenswrapper[4546]: I0201 07:05:41.080292 4546 generic.go:334] "Generic (PLEG): container finished" podID="62eaca86-490a-4079-b79c-700c6b51135c" containerID="4e1508d9e7960cf3709a5e9813be9a6f592d3ef76eb8a3d5055149c3a5bea8b5" exitCode=0
Feb 01 07:05:41 crc kubenswrapper[4546]: I0201 07:05:41.081310 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8" event={"ID":"62eaca86-490a-4079-b79c-700c6b51135c","Type":"ContainerDied","Data":"4e1508d9e7960cf3709a5e9813be9a6f592d3ef76eb8a3d5055149c3a5bea8b5"}
Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.447820 4546 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8" Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.586777 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-inventory\") pod \"62eaca86-490a-4079-b79c-700c6b51135c\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.587092 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-ssh-key-openstack-edpm-ipam\") pod \"62eaca86-490a-4079-b79c-700c6b51135c\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.587184 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-bootstrap-combined-ca-bundle\") pod \"62eaca86-490a-4079-b79c-700c6b51135c\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.587324 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkp4c\" (UniqueName: \"kubernetes.io/projected/62eaca86-490a-4079-b79c-700c6b51135c-kube-api-access-rkp4c\") pod \"62eaca86-490a-4079-b79c-700c6b51135c\" (UID: \"62eaca86-490a-4079-b79c-700c6b51135c\") " Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.593019 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62eaca86-490a-4079-b79c-700c6b51135c-kube-api-access-rkp4c" (OuterVolumeSpecName: "kube-api-access-rkp4c") pod "62eaca86-490a-4079-b79c-700c6b51135c" (UID: "62eaca86-490a-4079-b79c-700c6b51135c"). InnerVolumeSpecName "kube-api-access-rkp4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.593520 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "62eaca86-490a-4079-b79c-700c6b51135c" (UID: "62eaca86-490a-4079-b79c-700c6b51135c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.613934 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "62eaca86-490a-4079-b79c-700c6b51135c" (UID: "62eaca86-490a-4079-b79c-700c6b51135c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.614767 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-inventory" (OuterVolumeSpecName: "inventory") pod "62eaca86-490a-4079-b79c-700c6b51135c" (UID: "62eaca86-490a-4079-b79c-700c6b51135c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.689490 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.689514 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.689526 4546 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eaca86-490a-4079-b79c-700c6b51135c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:05:42 crc kubenswrapper[4546]: I0201 07:05:42.689534 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkp4c\" (UniqueName: \"kubernetes.io/projected/62eaca86-490a-4079-b79c-700c6b51135c-kube-api-access-rkp4c\") on node \"crc\" DevicePath \"\"" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.104115 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8" event={"ID":"62eaca86-490a-4079-b79c-700c6b51135c","Type":"ContainerDied","Data":"9d68ee4faf367f4eef46848cbbe223205440aad1ec3606297dd1446eb4dedbfe"} Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.104362 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d68ee4faf367f4eef46848cbbe223205440aad1ec3606297dd1446eb4dedbfe" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.104211 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt8k8" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.193732 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh"] Feb 01 07:05:43 crc kubenswrapper[4546]: E0201 07:05:43.194339 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62eaca86-490a-4079-b79c-700c6b51135c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.194362 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="62eaca86-490a-4079-b79c-700c6b51135c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.194646 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="62eaca86-490a-4079-b79c-700c6b51135c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.195458 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.199528 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.200217 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.200514 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.200673 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.208375 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh"] Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.301065 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.302656 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47qrg\" (UniqueName: \"kubernetes.io/projected/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-kube-api-access-47qrg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc 
kubenswrapper[4546]: I0201 07:05:43.302938 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.405604 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.405775 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.405979 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47qrg\" (UniqueName: \"kubernetes.io/projected/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-kube-api-access-47qrg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.413454 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.414133 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.419914 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47qrg\" (UniqueName: \"kubernetes.io/projected/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-kube-api-access-47qrg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:43 crc kubenswrapper[4546]: I0201 07:05:43.516260 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:05:44 crc kubenswrapper[4546]: I0201 07:05:44.000902 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh"] Feb 01 07:05:44 crc kubenswrapper[4546]: I0201 07:05:44.005800 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:05:44 crc kubenswrapper[4546]: I0201 07:05:44.115210 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" event={"ID":"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5","Type":"ContainerStarted","Data":"35effddae27788b477401c3411d30b74e2958ea76c5a402265ec08d9e0ae2099"} Feb 01 07:05:45 crc kubenswrapper[4546]: I0201 07:05:45.127950 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" event={"ID":"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5","Type":"ContainerStarted","Data":"6a5d660fae1d844938aab71e642c5a96e62e8843c3759b45aa33ba14de19c357"} Feb 01 07:05:45 crc kubenswrapper[4546]: I0201 07:05:45.156050 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" podStartSLOduration=1.571260408 podStartE2EDuration="2.156029357s" podCreationTimestamp="2026-02-01 07:05:43 +0000 UTC" firstStartedPulling="2026-02-01 07:05:44.005594195 +0000 UTC m=+1374.656530211" lastFinishedPulling="2026-02-01 07:05:44.590363144 +0000 UTC m=+1375.241299160" observedRunningTime="2026-02-01 07:05:45.148522022 +0000 UTC m=+1375.799458039" watchObservedRunningTime="2026-02-01 07:05:45.156029357 +0000 UTC m=+1375.806965372" Feb 01 07:05:54 crc kubenswrapper[4546]: I0201 07:05:54.924950 4546 scope.go:117] "RemoveContainer" containerID="c80b0bfc435be5a41c60880558336e3317d85bdb64376dc5b6efb4ac623bf962" Feb 01 07:05:54 crc 
kubenswrapper[4546]: I0201 07:05:54.950790 4546 scope.go:117] "RemoveContainer" containerID="fefa4df7d106b61293016f12588d6abd82428ce3e1b6fcab0480e7103410c527" Feb 01 07:05:54 crc kubenswrapper[4546]: I0201 07:05:54.968117 4546 scope.go:117] "RemoveContainer" containerID="e25dec80ca4d32fdfd281d21bd27a1e13d4c3fde6ce743d6799812758b279e79" Feb 01 07:05:54 crc kubenswrapper[4546]: I0201 07:05:54.986801 4546 scope.go:117] "RemoveContainer" containerID="bc86699e7d9725585b5f3f4777d648363be52b2a67d6ef9144d0e9e168eb67fe" Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.044600 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3734-account-create-update-h67dn"] Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.052076 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8d7b7"] Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.061946 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rsx9q"] Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.073350 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3734-account-create-update-h67dn"] Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.083056 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8fd9-account-create-update-cgcz9"] Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.090683 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rsx9q"] Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.096227 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8d7b7"] Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.115462 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8fd9-account-create-update-cgcz9"] Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.667197 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31ff518a-27b9-4b86-a833-9e45e65104e2" path="/var/lib/kubelet/pods/31ff518a-27b9-4b86-a833-9e45e65104e2/volumes" Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.669564 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de11318-8cb2-44c8-ab01-41be6b3cd1c8" path="/var/lib/kubelet/pods/4de11318-8cb2-44c8-ab01-41be6b3cd1c8/volumes" Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.671395 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5724b55b-9e2b-481e-8850-95a521a27999" path="/var/lib/kubelet/pods/5724b55b-9e2b-481e-8850-95a521a27999/volumes" Feb 01 07:06:05 crc kubenswrapper[4546]: I0201 07:06:05.673112 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c30dadf5-37b7-47b9-883f-556482c1e8d0" path="/var/lib/kubelet/pods/c30dadf5-37b7-47b9-883f-556482c1e8d0/volumes" Feb 01 07:06:07 crc kubenswrapper[4546]: I0201 07:06:07.029086 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-hs7bh"] Feb 01 07:06:07 crc kubenswrapper[4546]: I0201 07:06:07.036550 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7ee6-account-create-update-pn854"] Feb 01 07:06:07 crc kubenswrapper[4546]: I0201 07:06:07.044304 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-hs7bh"] Feb 01 07:06:07 crc kubenswrapper[4546]: I0201 07:06:07.050448 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7ee6-account-create-update-pn854"] Feb 01 07:06:07 crc kubenswrapper[4546]: I0201 07:06:07.675454 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486e299d-c690-4f46-8197-e531c53a8b11" path="/var/lib/kubelet/pods/486e299d-c690-4f46-8197-e531c53a8b11/volumes" Feb 01 07:06:07 crc kubenswrapper[4546]: I0201 07:06:07.676676 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e916bb26-02e3-4748-b195-0bb5bf550f71" 
path="/var/lib/kubelet/pods/e916bb26-02e3-4748-b195-0bb5bf550f71/volumes" Feb 01 07:06:23 crc kubenswrapper[4546]: I0201 07:06:23.028210 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rt9zh"] Feb 01 07:06:23 crc kubenswrapper[4546]: I0201 07:06:23.036549 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rt9zh"] Feb 01 07:06:23 crc kubenswrapper[4546]: I0201 07:06:23.666432 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715ff32b-ef52-42e4-8f3c-7e88c6612a20" path="/var/lib/kubelet/pods/715ff32b-ef52-42e4-8f3c-7e88c6612a20/volumes" Feb 01 07:06:49 crc kubenswrapper[4546]: I0201 07:06:49.051966 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-230e-account-create-update-scdj6"] Feb 01 07:06:49 crc kubenswrapper[4546]: I0201 07:06:49.066845 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-26c6-account-create-update-kqwq2"] Feb 01 07:06:49 crc kubenswrapper[4546]: I0201 07:06:49.080280 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-26c6-account-create-update-kqwq2"] Feb 01 07:06:49 crc kubenswrapper[4546]: I0201 07:06:49.095181 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-230e-account-create-update-scdj6"] Feb 01 07:06:49 crc kubenswrapper[4546]: I0201 07:06:49.669540 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f61fd99-8903-4fd1-a3ce-2c669ff13bd6" path="/var/lib/kubelet/pods/5f61fd99-8903-4fd1-a3ce-2c669ff13bd6/volumes" Feb 01 07:06:49 crc kubenswrapper[4546]: I0201 07:06:49.671441 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e6d2ae5-f55b-484f-bc46-615a464741f2" path="/var/lib/kubelet/pods/9e6d2ae5-f55b-484f-bc46-615a464741f2/volumes" Feb 01 07:06:51 crc kubenswrapper[4546]: I0201 07:06:51.027281 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-db-create-thwg2"] Feb 01 07:06:51 crc kubenswrapper[4546]: I0201 07:06:51.033522 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-thwg2"] Feb 01 07:06:51 crc kubenswrapper[4546]: I0201 07:06:51.665643 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17" path="/var/lib/kubelet/pods/7e2769ab-3a4d-4a8d-bc1e-e85e20ab5e17/volumes" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.034826 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-9kz7g"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.041692 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-gv2cj"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.048411 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f812-account-create-update-62t9x"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.054036 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hgpc7"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.059975 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d77-account-create-update-r85z2"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.067309 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hgpc7"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.071148 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-gv2cj"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.076658 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2d77-account-create-update-r85z2"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.081894 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f812-account-create-update-62t9x"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 
07:06:52.090771 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-9kz7g"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.207702 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cftmf"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.212040 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.217143 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cftmf"] Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.312155 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2n9\" (UniqueName: \"kubernetes.io/projected/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-kube-api-access-6f2n9\") pod \"certified-operators-cftmf\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.312300 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-catalog-content\") pod \"certified-operators-cftmf\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.312364 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-utilities\") pod \"certified-operators-cftmf\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.414126 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-utilities\") pod \"certified-operators-cftmf\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.414212 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2n9\" (UniqueName: \"kubernetes.io/projected/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-kube-api-access-6f2n9\") pod \"certified-operators-cftmf\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.414313 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-catalog-content\") pod \"certified-operators-cftmf\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.414789 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-catalog-content\") pod \"certified-operators-cftmf\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.414827 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-utilities\") pod \"certified-operators-cftmf\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.431948 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6f2n9\" (UniqueName: \"kubernetes.io/projected/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-kube-api-access-6f2n9\") pod \"certified-operators-cftmf\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:52 crc kubenswrapper[4546]: I0201 07:06:52.532497 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:06:53 crc kubenswrapper[4546]: I0201 07:06:53.048942 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cftmf"] Feb 01 07:06:53 crc kubenswrapper[4546]: I0201 07:06:53.663090 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13054411-2b0e-4c43-99c8-b10a5f7e6d07" path="/var/lib/kubelet/pods/13054411-2b0e-4c43-99c8-b10a5f7e6d07/volumes" Feb 01 07:06:53 crc kubenswrapper[4546]: I0201 07:06:53.664083 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="327fb651-55a3-4732-98be-4e36956c7ff0" path="/var/lib/kubelet/pods/327fb651-55a3-4732-98be-4e36956c7ff0/volumes" Feb 01 07:06:53 crc kubenswrapper[4546]: I0201 07:06:53.664686 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62626f59-4035-4cc1-bcfb-219e7782df0b" path="/var/lib/kubelet/pods/62626f59-4035-4cc1-bcfb-219e7782df0b/volumes" Feb 01 07:06:53 crc kubenswrapper[4546]: I0201 07:06:53.665409 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a44a572-90bb-4589-b62d-7ffa43f490bc" path="/var/lib/kubelet/pods/8a44a572-90bb-4589-b62d-7ffa43f490bc/volumes" Feb 01 07:06:53 crc kubenswrapper[4546]: I0201 07:06:53.679035 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2844a5-3270-4037-a106-bd22aa315e85" path="/var/lib/kubelet/pods/fb2844a5-3270-4037-a106-bd22aa315e85/volumes" Feb 01 07:06:53 crc kubenswrapper[4546]: I0201 07:06:53.792446 4546 generic.go:334] "Generic (PLEG): container 
finished" podID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerID="f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1" exitCode=0 Feb 01 07:06:53 crc kubenswrapper[4546]: I0201 07:06:53.792547 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftmf" event={"ID":"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805","Type":"ContainerDied","Data":"f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1"} Feb 01 07:06:53 crc kubenswrapper[4546]: I0201 07:06:53.792615 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftmf" event={"ID":"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805","Type":"ContainerStarted","Data":"3253192657a9e7d0c7969880f78a1ddeda25f85d29054bd9f615c204dba9277d"} Feb 01 07:06:54 crc kubenswrapper[4546]: I0201 07:06:54.804518 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftmf" event={"ID":"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805","Type":"ContainerStarted","Data":"622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249"} Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.049652 4546 scope.go:117] "RemoveContainer" containerID="e70b3326ebad4e85b3cf2c4dd2a9a14a86c06473d000a357f1437d660a9f8322" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.074272 4546 scope.go:117] "RemoveContainer" containerID="41567098d151d3dad369b421cd0132a0ea4ba02628484b3a240766dfa076593e" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.122607 4546 scope.go:117] "RemoveContainer" containerID="ffae3c82fb2025796d0dd09d24e6ada2b325abe3a9b385877a082ad57ca51dfd" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.180093 4546 scope.go:117] "RemoveContainer" containerID="2f8f478fc42a4c5b40f2b8c9e8b63ccf464cc2944e337d903df0d33c1f4571fa" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.204141 4546 scope.go:117] "RemoveContainer" 
containerID="7139512aca28f1e82c87f008ae5b107d1d9fbf12a6b748751c0e3aace5ed7390" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.234251 4546 scope.go:117] "RemoveContainer" containerID="cbd618404a3c0c640ef1da47a81380e517d96a20c1dcbeed65fab5d8c94da8aa" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.254109 4546 scope.go:117] "RemoveContainer" containerID="2db4c7f532485ba6e41af2dbb05cb3e029aca30230c80ab58fdfbd775a81cdb5" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.271991 4546 scope.go:117] "RemoveContainer" containerID="3f0139c92efc216b33e1a122972ed585e59851589ef7bd375886afe857c61534" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.303722 4546 scope.go:117] "RemoveContainer" containerID="705fa0294061f140bf97c65a1f39858dfd393b235786d0643611d89db698e6a5" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.325744 4546 scope.go:117] "RemoveContainer" containerID="6a2bc8eaca085f28b306ca7616e42e02eb91939c79167a0f97c847eb5e8adaed" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.348738 4546 scope.go:117] "RemoveContainer" containerID="3fd182c0c1a7fc96ba75e18ef1b9ecac974ad33f103751f8002ae0932f083aa2" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.366683 4546 scope.go:117] "RemoveContainer" containerID="69b1130ae2ae4499049d91bb6ff8e1d20696717a491e1fdfe94d53f21721dd03" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.389531 4546 scope.go:117] "RemoveContainer" containerID="e9b6c6995fb811aedaa272c2f9041e4e526fb9e4824350e565a424423c687e95" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.418831 4546 scope.go:117] "RemoveContainer" containerID="ed60d86f7a325fd3673194488680e68299af922e70b3e097d01fa86d7357325b" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.420373 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.420464 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.446147 4546 scope.go:117] "RemoveContainer" containerID="56cc250e7e1e535454538883a5f716a1ab3060ee43ce7d3519d0cf3987409cb4" Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.825785 4546 generic.go:334] "Generic (PLEG): container finished" podID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerID="622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249" exitCode=0 Feb 01 07:06:55 crc kubenswrapper[4546]: I0201 07:06:55.825895 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftmf" event={"ID":"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805","Type":"ContainerDied","Data":"622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249"} Feb 01 07:06:56 crc kubenswrapper[4546]: I0201 07:06:56.841156 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftmf" event={"ID":"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805","Type":"ContainerStarted","Data":"5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04"} Feb 01 07:06:56 crc kubenswrapper[4546]: I0201 07:06:56.866221 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cftmf" podStartSLOduration=2.297447644 podStartE2EDuration="4.866190452s" podCreationTimestamp="2026-02-01 07:06:52 +0000 UTC" firstStartedPulling="2026-02-01 07:06:53.79464827 +0000 UTC m=+1444.445584286" lastFinishedPulling="2026-02-01 07:06:56.363391078 +0000 UTC m=+1447.014327094" 
observedRunningTime="2026-02-01 07:06:56.858086684 +0000 UTC m=+1447.509022700" watchObservedRunningTime="2026-02-01 07:06:56.866190452 +0000 UTC m=+1447.517126469" Feb 01 07:07:00 crc kubenswrapper[4546]: I0201 07:07:00.032602 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-c7gjn"] Feb 01 07:07:00 crc kubenswrapper[4546]: I0201 07:07:00.041030 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-c7gjn"] Feb 01 07:07:01 crc kubenswrapper[4546]: I0201 07:07:01.667065 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d22a574-dd06-478f-937c-6cec20a5777c" path="/var/lib/kubelet/pods/1d22a574-dd06-478f-937c-6cec20a5777c/volumes" Feb 01 07:07:02 crc kubenswrapper[4546]: I0201 07:07:02.533021 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:07:02 crc kubenswrapper[4546]: I0201 07:07:02.533096 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:07:02 crc kubenswrapper[4546]: I0201 07:07:02.575009 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:07:02 crc kubenswrapper[4546]: I0201 07:07:02.951940 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:07:03 crc kubenswrapper[4546]: I0201 07:07:03.013087 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cftmf"] Feb 01 07:07:04 crc kubenswrapper[4546]: I0201 07:07:04.925557 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cftmf" podUID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerName="registry-server" 
containerID="cri-o://5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04" gracePeriod=2 Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.027618 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-sscjj"] Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.045589 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-sscjj"] Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.372338 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.424937 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-utilities\") pod \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.424984 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f2n9\" (UniqueName: \"kubernetes.io/projected/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-kube-api-access-6f2n9\") pod \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.425077 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-catalog-content\") pod \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\" (UID: \"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805\") " Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.425767 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-utilities" (OuterVolumeSpecName: "utilities") pod "56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" (UID: 
"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.432212 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-kube-api-access-6f2n9" (OuterVolumeSpecName: "kube-api-access-6f2n9") pod "56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" (UID: "56bd8fd3-ac8b-4d02-a080-dc5fa0c06805"). InnerVolumeSpecName "kube-api-access-6f2n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.467627 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" (UID: "56bd8fd3-ac8b-4d02-a080-dc5fa0c06805"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.526653 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.526688 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.526699 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f2n9\" (UniqueName: \"kubernetes.io/projected/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805-kube-api-access-6f2n9\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.665079 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2bf01534-1b7d-4f23-bc2c-02cb329a2036" path="/var/lib/kubelet/pods/2bf01534-1b7d-4f23-bc2c-02cb329a2036/volumes" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.971599 4546 generic.go:334] "Generic (PLEG): container finished" podID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerID="5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04" exitCode=0 Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.971930 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftmf" event={"ID":"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805","Type":"ContainerDied","Data":"5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04"} Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.971995 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cftmf" event={"ID":"56bd8fd3-ac8b-4d02-a080-dc5fa0c06805","Type":"ContainerDied","Data":"3253192657a9e7d0c7969880f78a1ddeda25f85d29054bd9f615c204dba9277d"} Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.972017 4546 scope.go:117] "RemoveContainer" containerID="5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.972474 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cftmf" Feb 01 07:07:05 crc kubenswrapper[4546]: I0201 07:07:05.993057 4546 scope.go:117] "RemoveContainer" containerID="622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249" Feb 01 07:07:06 crc kubenswrapper[4546]: I0201 07:07:06.028471 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cftmf"] Feb 01 07:07:06 crc kubenswrapper[4546]: I0201 07:07:06.048875 4546 scope.go:117] "RemoveContainer" containerID="f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1" Feb 01 07:07:06 crc kubenswrapper[4546]: I0201 07:07:06.065382 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cftmf"] Feb 01 07:07:06 crc kubenswrapper[4546]: I0201 07:07:06.119025 4546 scope.go:117] "RemoveContainer" containerID="5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04" Feb 01 07:07:06 crc kubenswrapper[4546]: E0201 07:07:06.125938 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04\": container with ID starting with 5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04 not found: ID does not exist" containerID="5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04" Feb 01 07:07:06 crc kubenswrapper[4546]: I0201 07:07:06.125977 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04"} err="failed to get container status \"5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04\": rpc error: code = NotFound desc = could not find container \"5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04\": container with ID starting with 5ee8c6a4f4a49da1b1a7e2a54a4664249f6c7814aecdc5fc099bd5336032ca04 not 
found: ID does not exist" Feb 01 07:07:06 crc kubenswrapper[4546]: I0201 07:07:06.126003 4546 scope.go:117] "RemoveContainer" containerID="622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249" Feb 01 07:07:06 crc kubenswrapper[4546]: E0201 07:07:06.128952 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249\": container with ID starting with 622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249 not found: ID does not exist" containerID="622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249" Feb 01 07:07:06 crc kubenswrapper[4546]: I0201 07:07:06.128980 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249"} err="failed to get container status \"622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249\": rpc error: code = NotFound desc = could not find container \"622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249\": container with ID starting with 622917a53ece82ac1ce21ac8c5cb9868a100e81685a5fc2ca56b1fad68cc0249 not found: ID does not exist" Feb 01 07:07:06 crc kubenswrapper[4546]: I0201 07:07:06.128998 4546 scope.go:117] "RemoveContainer" containerID="f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1" Feb 01 07:07:06 crc kubenswrapper[4546]: E0201 07:07:06.132508 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1\": container with ID starting with f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1 not found: ID does not exist" containerID="f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1" Feb 01 07:07:06 crc kubenswrapper[4546]: I0201 07:07:06.132537 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1"} err="failed to get container status \"f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1\": rpc error: code = NotFound desc = could not find container \"f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1\": container with ID starting with f0a72de8b350726462e6b7ae92b73184632a1349c049eac87cc232910c5b8ee1 not found: ID does not exist" Feb 01 07:07:07 crc kubenswrapper[4546]: I0201 07:07:07.666754 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" path="/var/lib/kubelet/pods/56bd8fd3-ac8b-4d02-a080-dc5fa0c06805/volumes" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.751994 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-djb8m"] Feb 01 07:07:16 crc kubenswrapper[4546]: E0201 07:07:16.752886 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerName="extract-content" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.752902 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerName="extract-content" Feb 01 07:07:16 crc kubenswrapper[4546]: E0201 07:07:16.752910 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerName="registry-server" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.752916 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerName="registry-server" Feb 01 07:07:16 crc kubenswrapper[4546]: E0201 07:07:16.752931 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerName="extract-utilities" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 
07:07:16.752937 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerName="extract-utilities" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.753135 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="56bd8fd3-ac8b-4d02-a080-dc5fa0c06805" containerName="registry-server" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.758678 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.765628 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-djb8m"] Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.844404 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-utilities\") pod \"redhat-marketplace-djb8m\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.844494 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwp4s\" (UniqueName: \"kubernetes.io/projected/2c014b41-84c7-4e50-a440-2ebd23c967e5-kube-api-access-lwp4s\") pod \"redhat-marketplace-djb8m\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.844900 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-catalog-content\") pod \"redhat-marketplace-djb8m\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:16 crc kubenswrapper[4546]: 
I0201 07:07:16.946511 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwp4s\" (UniqueName: \"kubernetes.io/projected/2c014b41-84c7-4e50-a440-2ebd23c967e5-kube-api-access-lwp4s\") pod \"redhat-marketplace-djb8m\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.946604 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-catalog-content\") pod \"redhat-marketplace-djb8m\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.946677 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-utilities\") pod \"redhat-marketplace-djb8m\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.947060 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-catalog-content\") pod \"redhat-marketplace-djb8m\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.947081 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-utilities\") pod \"redhat-marketplace-djb8m\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:16 crc kubenswrapper[4546]: I0201 07:07:16.971601 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwp4s\" (UniqueName: \"kubernetes.io/projected/2c014b41-84c7-4e50-a440-2ebd23c967e5-kube-api-access-lwp4s\") pod \"redhat-marketplace-djb8m\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:17 crc kubenswrapper[4546]: I0201 07:07:17.073398 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:17 crc kubenswrapper[4546]: I0201 07:07:17.515927 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-djb8m"] Feb 01 07:07:18 crc kubenswrapper[4546]: I0201 07:07:18.087430 4546 generic.go:334] "Generic (PLEG): container finished" podID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerID="738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed" exitCode=0 Feb 01 07:07:18 crc kubenswrapper[4546]: I0201 07:07:18.087638 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djb8m" event={"ID":"2c014b41-84c7-4e50-a440-2ebd23c967e5","Type":"ContainerDied","Data":"738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed"} Feb 01 07:07:18 crc kubenswrapper[4546]: I0201 07:07:18.088923 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djb8m" event={"ID":"2c014b41-84c7-4e50-a440-2ebd23c967e5","Type":"ContainerStarted","Data":"4e0f88d88a6742f109f7f80ab02f92a68c610982505ccf8b9e5534baf2999c93"} Feb 01 07:07:19 crc kubenswrapper[4546]: I0201 07:07:19.100184 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djb8m" event={"ID":"2c014b41-84c7-4e50-a440-2ebd23c967e5","Type":"ContainerStarted","Data":"ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f"} Feb 01 07:07:20 crc kubenswrapper[4546]: I0201 07:07:20.115137 4546 
generic.go:334] "Generic (PLEG): container finished" podID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerID="ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f" exitCode=0 Feb 01 07:07:20 crc kubenswrapper[4546]: I0201 07:07:20.115383 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djb8m" event={"ID":"2c014b41-84c7-4e50-a440-2ebd23c967e5","Type":"ContainerDied","Data":"ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f"} Feb 01 07:07:21 crc kubenswrapper[4546]: I0201 07:07:21.038877 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cd5px"] Feb 01 07:07:21 crc kubenswrapper[4546]: I0201 07:07:21.047544 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cd5px"] Feb 01 07:07:21 crc kubenswrapper[4546]: I0201 07:07:21.130535 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djb8m" event={"ID":"2c014b41-84c7-4e50-a440-2ebd23c967e5","Type":"ContainerStarted","Data":"6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d"} Feb 01 07:07:21 crc kubenswrapper[4546]: I0201 07:07:21.155591 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-djb8m" podStartSLOduration=2.6240806660000002 podStartE2EDuration="5.15557328s" podCreationTimestamp="2026-02-01 07:07:16 +0000 UTC" firstStartedPulling="2026-02-01 07:07:18.089947792 +0000 UTC m=+1468.740883808" lastFinishedPulling="2026-02-01 07:07:20.621440406 +0000 UTC m=+1471.272376422" observedRunningTime="2026-02-01 07:07:21.147621716 +0000 UTC m=+1471.798557732" watchObservedRunningTime="2026-02-01 07:07:21.15557328 +0000 UTC m=+1471.806509295" Feb 01 07:07:21 crc kubenswrapper[4546]: I0201 07:07:21.667882 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19e5c53-445e-4852-80c6-7bce38282557" 
path="/var/lib/kubelet/pods/e19e5c53-445e-4852-80c6-7bce38282557/volumes" Feb 01 07:07:25 crc kubenswrapper[4546]: I0201 07:07:25.420782 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:07:25 crc kubenswrapper[4546]: I0201 07:07:25.421985 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:07:27 crc kubenswrapper[4546]: I0201 07:07:27.074205 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:27 crc kubenswrapper[4546]: I0201 07:07:27.074637 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:27 crc kubenswrapper[4546]: I0201 07:07:27.118329 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:27 crc kubenswrapper[4546]: I0201 07:07:27.224187 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:27 crc kubenswrapper[4546]: I0201 07:07:27.352095 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-djb8m"] Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.199225 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-djb8m" podUID="2c014b41-84c7-4e50-a440-2ebd23c967e5" 
containerName="registry-server" containerID="cri-o://6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d" gracePeriod=2 Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.608295 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.761431 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-utilities\") pod \"2c014b41-84c7-4e50-a440-2ebd23c967e5\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.761478 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-utilities" (OuterVolumeSpecName: "utilities") pod "2c014b41-84c7-4e50-a440-2ebd23c967e5" (UID: "2c014b41-84c7-4e50-a440-2ebd23c967e5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.761747 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwp4s\" (UniqueName: \"kubernetes.io/projected/2c014b41-84c7-4e50-a440-2ebd23c967e5-kube-api-access-lwp4s\") pod \"2c014b41-84c7-4e50-a440-2ebd23c967e5\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.761931 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-catalog-content\") pod \"2c014b41-84c7-4e50-a440-2ebd23c967e5\" (UID: \"2c014b41-84c7-4e50-a440-2ebd23c967e5\") " Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.763428 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.769007 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c014b41-84c7-4e50-a440-2ebd23c967e5-kube-api-access-lwp4s" (OuterVolumeSpecName: "kube-api-access-lwp4s") pod "2c014b41-84c7-4e50-a440-2ebd23c967e5" (UID: "2c014b41-84c7-4e50-a440-2ebd23c967e5"). InnerVolumeSpecName "kube-api-access-lwp4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.786267 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c014b41-84c7-4e50-a440-2ebd23c967e5" (UID: "2c014b41-84c7-4e50-a440-2ebd23c967e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.866564 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwp4s\" (UniqueName: \"kubernetes.io/projected/2c014b41-84c7-4e50-a440-2ebd23c967e5-kube-api-access-lwp4s\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:29 crc kubenswrapper[4546]: I0201 07:07:29.866599 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c014b41-84c7-4e50-a440-2ebd23c967e5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.209451 4546 generic.go:334] "Generic (PLEG): container finished" podID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerID="6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d" exitCode=0 Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.209505 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-djb8m" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.209532 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djb8m" event={"ID":"2c014b41-84c7-4e50-a440-2ebd23c967e5","Type":"ContainerDied","Data":"6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d"} Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.210377 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djb8m" event={"ID":"2c014b41-84c7-4e50-a440-2ebd23c967e5","Type":"ContainerDied","Data":"4e0f88d88a6742f109f7f80ab02f92a68c610982505ccf8b9e5534baf2999c93"} Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.210397 4546 scope.go:117] "RemoveContainer" containerID="6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.233583 4546 scope.go:117] "RemoveContainer" 
containerID="ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.254061 4546 scope.go:117] "RemoveContainer" containerID="738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.259072 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-djb8m"] Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.275901 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-djb8m"] Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.296077 4546 scope.go:117] "RemoveContainer" containerID="6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d" Feb 01 07:07:30 crc kubenswrapper[4546]: E0201 07:07:30.321039 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d\": container with ID starting with 6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d not found: ID does not exist" containerID="6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.321086 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d"} err="failed to get container status \"6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d\": rpc error: code = NotFound desc = could not find container \"6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d\": container with ID starting with 6476132ac79c347f47f7b323ab0de36599e7bb33850b1e723460a2600f14d25d not found: ID does not exist" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.321114 4546 scope.go:117] "RemoveContainer" 
containerID="ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f" Feb 01 07:07:30 crc kubenswrapper[4546]: E0201 07:07:30.328008 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f\": container with ID starting with ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f not found: ID does not exist" containerID="ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.328048 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f"} err="failed to get container status \"ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f\": rpc error: code = NotFound desc = could not find container \"ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f\": container with ID starting with ca4acf014981108e96d5cd83a0b2f0c41c05dde8743bd06a1fc0de8ebeecc46f not found: ID does not exist" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.328072 4546 scope.go:117] "RemoveContainer" containerID="738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed" Feb 01 07:07:30 crc kubenswrapper[4546]: E0201 07:07:30.341423 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed\": container with ID starting with 738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed not found: ID does not exist" containerID="738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed" Feb 01 07:07:30 crc kubenswrapper[4546]: I0201 07:07:30.341471 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed"} err="failed to get container status \"738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed\": rpc error: code = NotFound desc = could not find container \"738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed\": container with ID starting with 738459436a875141d39714078a2f18ce09f47728e3ba73ec087ba147820d41ed not found: ID does not exist" Feb 01 07:07:31 crc kubenswrapper[4546]: I0201 07:07:31.665873 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c014b41-84c7-4e50-a440-2ebd23c967e5" path="/var/lib/kubelet/pods/2c014b41-84c7-4e50-a440-2ebd23c967e5/volumes" Feb 01 07:07:47 crc kubenswrapper[4546]: I0201 07:07:47.063485 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-qjczq"] Feb 01 07:07:47 crc kubenswrapper[4546]: I0201 07:07:47.090075 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-qjczq"] Feb 01 07:07:47 crc kubenswrapper[4546]: I0201 07:07:47.668482 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af56bb5-2257-4f2f-97c8-a33236d55b81" path="/var/lib/kubelet/pods/7af56bb5-2257-4f2f-97c8-a33236d55b81/volumes" Feb 01 07:07:49 crc kubenswrapper[4546]: I0201 07:07:49.042275 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4wt8z"] Feb 01 07:07:49 crc kubenswrapper[4546]: I0201 07:07:49.050232 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4wt8z"] Feb 01 07:07:49 crc kubenswrapper[4546]: I0201 07:07:49.668369 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156aa66f-373e-4f1f-bcb5-4a764235a839" path="/var/lib/kubelet/pods/156aa66f-373e-4f1f-bcb5-4a764235a839/volumes" Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.420878 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.421376 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.421418 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.422316 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.422376 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" gracePeriod=600 Feb 01 07:07:55 crc kubenswrapper[4546]: E0201 07:07:55.538045 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.706632 4546 scope.go:117] "RemoveContainer" containerID="d26c1a3c7b7135a987f7d5a19835eccee9bed2582a192dbe74791bf6131eec26" Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.732428 4546 scope.go:117] "RemoveContainer" containerID="d9ce5a08c153effc0cb36d48340ca8be1974180bcec34eaa605af9177d079ebf" Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.774966 4546 scope.go:117] "RemoveContainer" containerID="9e86461b8024e892cac94f2fcccea6cdb576941b61c420446695ed6de77ab5c0" Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.810264 4546 scope.go:117] "RemoveContainer" containerID="ca676bfa1fe391f87550448426c1dbc286f9722ad540f53698167426dc53b6b8" Feb 01 07:07:55 crc kubenswrapper[4546]: I0201 07:07:55.881794 4546 scope.go:117] "RemoveContainer" containerID="e56b7a4aa8f6dcd5d20db3ec6730c32b18bf116f3bc1a0d4982702a6ef39fc61" Feb 01 07:07:56 crc kubenswrapper[4546]: I0201 07:07:56.487335 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" exitCode=0 Feb 01 07:07:56 crc kubenswrapper[4546]: I0201 07:07:56.487387 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f"} Feb 01 07:07:56 crc kubenswrapper[4546]: I0201 07:07:56.487669 4546 scope.go:117] "RemoveContainer" containerID="1d4ad86c403500757fcc4279e352c025d98a79c4116ab07f1b0bdf4a335f7d1e" Feb 01 07:07:56 crc kubenswrapper[4546]: I0201 07:07:56.488529 4546 scope.go:117] "RemoveContainer" 
containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:07:56 crc kubenswrapper[4546]: E0201 07:07:56.489049 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:08:06 crc kubenswrapper[4546]: I0201 07:08:06.591069 4546 generic.go:334] "Generic (PLEG): container finished" podID="5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5" containerID="6a5d660fae1d844938aab71e642c5a96e62e8843c3759b45aa33ba14de19c357" exitCode=0 Feb 01 07:08:06 crc kubenswrapper[4546]: I0201 07:08:06.591284 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" event={"ID":"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5","Type":"ContainerDied","Data":"6a5d660fae1d844938aab71e642c5a96e62e8843c3759b45aa33ba14de19c357"} Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.037648 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.221294 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-ssh-key-openstack-edpm-ipam\") pod \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.221387 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47qrg\" (UniqueName: \"kubernetes.io/projected/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-kube-api-access-47qrg\") pod \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.221882 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-inventory\") pod \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\" (UID: \"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5\") " Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.228081 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-kube-api-access-47qrg" (OuterVolumeSpecName: "kube-api-access-47qrg") pod "5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5" (UID: "5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5"). InnerVolumeSpecName "kube-api-access-47qrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.247432 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5" (UID: "5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.249092 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-inventory" (OuterVolumeSpecName: "inventory") pod "5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5" (UID: "5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.325290 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.325321 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47qrg\" (UniqueName: \"kubernetes.io/projected/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-kube-api-access-47qrg\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.325333 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.613769 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" 
event={"ID":"5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5","Type":"ContainerDied","Data":"35effddae27788b477401c3411d30b74e2958ea76c5a402265ec08d9e0ae2099"} Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.613835 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ftdxh" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.613843 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35effddae27788b477401c3411d30b74e2958ea76c5a402265ec08d9e0ae2099" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.698010 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws"] Feb 01 07:08:08 crc kubenswrapper[4546]: E0201 07:08:08.698478 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.698498 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 01 07:08:08 crc kubenswrapper[4546]: E0201 07:08:08.698515 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerName="extract-content" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.698521 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerName="extract-content" Feb 01 07:08:08 crc kubenswrapper[4546]: E0201 07:08:08.698530 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerName="registry-server" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.698536 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerName="registry-server" Feb 01 07:08:08 crc kubenswrapper[4546]: E0201 07:08:08.698548 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerName="extract-utilities" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.698553 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerName="extract-utilities" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.698770 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3ce12b-54d5-481d-ad84-2e0f0fbca1f5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.698800 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c014b41-84c7-4e50-a440-2ebd23c967e5" containerName="registry-server" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.699489 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.702036 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.702329 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.702570 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.702702 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.720035 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws"] Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.836480 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7tfz\" (UniqueName: \"kubernetes.io/projected/0669231c-f180-4931-b874-16f2ed38e2b4-kube-api-access-p7tfz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.836565 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 
07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.836957 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.939291 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7tfz\" (UniqueName: \"kubernetes.io/projected/0669231c-f180-4931-b874-16f2ed38e2b4-kube-api-access-p7tfz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.939545 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.939949 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.944509 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.944665 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:08 crc kubenswrapper[4546]: I0201 07:08:08.954671 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7tfz\" (UniqueName: \"kubernetes.io/projected/0669231c-f180-4931-b874-16f2ed38e2b4-kube-api-access-p7tfz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:09 crc kubenswrapper[4546]: I0201 07:08:09.016222 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:08:09 crc kubenswrapper[4546]: I0201 07:08:09.510165 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws"] Feb 01 07:08:09 crc kubenswrapper[4546]: I0201 07:08:09.622809 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" event={"ID":"0669231c-f180-4931-b874-16f2ed38e2b4","Type":"ContainerStarted","Data":"4e3f0b74da1f467475c921c4d2c7c09ab551fc3a36db76308fa74c3933111091"} Feb 01 07:08:10 crc kubenswrapper[4546]: I0201 07:08:10.637253 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" event={"ID":"0669231c-f180-4931-b874-16f2ed38e2b4","Type":"ContainerStarted","Data":"3cc079fac5e713994755ee8227804c0dbd73ddf240fb3690b5618429accbd54b"} Feb 01 07:08:10 crc kubenswrapper[4546]: I0201 07:08:10.661808 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" podStartSLOduration=2.137806342 podStartE2EDuration="2.661788147s" podCreationTimestamp="2026-02-01 07:08:08 +0000 UTC" firstStartedPulling="2026-02-01 07:08:09.515845232 +0000 UTC m=+1520.166781248" lastFinishedPulling="2026-02-01 07:08:10.039827046 +0000 UTC m=+1520.690763053" observedRunningTime="2026-02-01 07:08:10.653719625 +0000 UTC m=+1521.304655630" watchObservedRunningTime="2026-02-01 07:08:10.661788147 +0000 UTC m=+1521.312724163" Feb 01 07:08:11 crc kubenswrapper[4546]: I0201 07:08:11.063499 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-b9btc"] Feb 01 07:08:11 crc kubenswrapper[4546]: I0201 07:08:11.073331 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-6ktch"] Feb 01 07:08:11 crc kubenswrapper[4546]: 
I0201 07:08:11.083404 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-pgw6x"] Feb 01 07:08:11 crc kubenswrapper[4546]: I0201 07:08:11.090493 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-b9btc"] Feb 01 07:08:11 crc kubenswrapper[4546]: I0201 07:08:11.095525 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-6ktch"] Feb 01 07:08:11 crc kubenswrapper[4546]: I0201 07:08:11.101197 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-pgw6x"] Feb 01 07:08:11 crc kubenswrapper[4546]: I0201 07:08:11.656028 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:08:11 crc kubenswrapper[4546]: E0201 07:08:11.657037 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:08:11 crc kubenswrapper[4546]: I0201 07:08:11.672777 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c89483-60db-4db0-8957-32962d2a73b1" path="/var/lib/kubelet/pods/59c89483-60db-4db0-8957-32962d2a73b1/volumes" Feb 01 07:08:11 crc kubenswrapper[4546]: I0201 07:08:11.674678 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b4a2956-c177-42f3-8981-830dbac77943" path="/var/lib/kubelet/pods/8b4a2956-c177-42f3-8981-830dbac77943/volumes" Feb 01 07:08:11 crc kubenswrapper[4546]: I0201 07:08:11.676717 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d86af3-9b64-4ebd-ac39-e2063ea7c9b6" 
path="/var/lib/kubelet/pods/91d86af3-9b64-4ebd-ac39-e2063ea7c9b6/volumes" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.013084 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d87mg"] Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.015711 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.025075 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d87mg"] Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.114975 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8d8p\" (UniqueName: \"kubernetes.io/projected/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-kube-api-access-p8d8p\") pod \"redhat-operators-d87mg\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.115033 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-catalog-content\") pod \"redhat-operators-d87mg\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.115377 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-utilities\") pod \"redhat-operators-d87mg\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.217086 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p8d8p\" (UniqueName: \"kubernetes.io/projected/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-kube-api-access-p8d8p\") pod \"redhat-operators-d87mg\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.217128 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-catalog-content\") pod \"redhat-operators-d87mg\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.217241 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-utilities\") pod \"redhat-operators-d87mg\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.217752 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-catalog-content\") pod \"redhat-operators-d87mg\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.218329 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-utilities\") pod \"redhat-operators-d87mg\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.246887 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8d8p\" (UniqueName: 
\"kubernetes.io/projected/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-kube-api-access-p8d8p\") pod \"redhat-operators-d87mg\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.334244 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:16 crc kubenswrapper[4546]: I0201 07:08:16.820536 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d87mg"] Feb 01 07:08:17 crc kubenswrapper[4546]: I0201 07:08:17.722988 4546 generic.go:334] "Generic (PLEG): container finished" podID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerID="5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498" exitCode=0 Feb 01 07:08:17 crc kubenswrapper[4546]: I0201 07:08:17.723035 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d87mg" event={"ID":"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421","Type":"ContainerDied","Data":"5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498"} Feb 01 07:08:17 crc kubenswrapper[4546]: I0201 07:08:17.724118 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d87mg" event={"ID":"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421","Type":"ContainerStarted","Data":"c8185a61aaf53cce278ac1a8d2f398fa0919991d5b884d43cf49dabff400760e"} Feb 01 07:08:18 crc kubenswrapper[4546]: I0201 07:08:18.739363 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d87mg" event={"ID":"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421","Type":"ContainerStarted","Data":"1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461"} Feb 01 07:08:21 crc kubenswrapper[4546]: I0201 07:08:21.768129 4546 generic.go:334] "Generic (PLEG): container finished" podID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" 
containerID="1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461" exitCode=0 Feb 01 07:08:21 crc kubenswrapper[4546]: I0201 07:08:21.768211 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d87mg" event={"ID":"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421","Type":"ContainerDied","Data":"1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461"} Feb 01 07:08:22 crc kubenswrapper[4546]: I0201 07:08:22.781015 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d87mg" event={"ID":"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421","Type":"ContainerStarted","Data":"63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808"} Feb 01 07:08:22 crc kubenswrapper[4546]: I0201 07:08:22.798302 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d87mg" podStartSLOduration=3.312582129 podStartE2EDuration="7.798276588s" podCreationTimestamp="2026-02-01 07:08:15 +0000 UTC" firstStartedPulling="2026-02-01 07:08:17.725526461 +0000 UTC m=+1528.376462477" lastFinishedPulling="2026-02-01 07:08:22.21122092 +0000 UTC m=+1532.862156936" observedRunningTime="2026-02-01 07:08:22.796923878 +0000 UTC m=+1533.447859895" watchObservedRunningTime="2026-02-01 07:08:22.798276588 +0000 UTC m=+1533.449212605" Feb 01 07:08:26 crc kubenswrapper[4546]: I0201 07:08:26.334923 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:26 crc kubenswrapper[4546]: I0201 07:08:26.336413 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:26 crc kubenswrapper[4546]: I0201 07:08:26.655684 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:08:26 crc kubenswrapper[4546]: E0201 07:08:26.656418 4546 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:08:27 crc kubenswrapper[4546]: I0201 07:08:27.373925 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d87mg" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerName="registry-server" probeResult="failure" output=< Feb 01 07:08:27 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:08:27 crc kubenswrapper[4546]: > Feb 01 07:08:36 crc kubenswrapper[4546]: I0201 07:08:36.377067 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:36 crc kubenswrapper[4546]: I0201 07:08:36.429429 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:36 crc kubenswrapper[4546]: I0201 07:08:36.619639 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d87mg"] Feb 01 07:08:37 crc kubenswrapper[4546]: I0201 07:08:37.655108 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:08:37 crc kubenswrapper[4546]: E0201 07:08:37.655740 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:08:37 crc kubenswrapper[4546]: I0201 07:08:37.935563 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d87mg" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerName="registry-server" containerID="cri-o://63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808" gracePeriod=2 Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.383521 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.427086 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-utilities\") pod \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.427342 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8d8p\" (UniqueName: \"kubernetes.io/projected/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-kube-api-access-p8d8p\") pod \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.427435 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-catalog-content\") pod \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\" (UID: \"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421\") " Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.427778 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-utilities" (OuterVolumeSpecName: "utilities") pod 
"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" (UID: "a8cf016c-4bdf-4e9a-83db-4ba1eb81f421"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.428195 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.435591 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-kube-api-access-p8d8p" (OuterVolumeSpecName: "kube-api-access-p8d8p") pod "a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" (UID: "a8cf016c-4bdf-4e9a-83db-4ba1eb81f421"). InnerVolumeSpecName "kube-api-access-p8d8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.531283 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8d8p\" (UniqueName: \"kubernetes.io/projected/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-kube-api-access-p8d8p\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.534834 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" (UID: "a8cf016c-4bdf-4e9a-83db-4ba1eb81f421"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.634066 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.949486 4546 generic.go:334] "Generic (PLEG): container finished" podID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerID="63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808" exitCode=0 Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.949534 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d87mg" event={"ID":"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421","Type":"ContainerDied","Data":"63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808"} Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.949576 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d87mg" event={"ID":"a8cf016c-4bdf-4e9a-83db-4ba1eb81f421","Type":"ContainerDied","Data":"c8185a61aaf53cce278ac1a8d2f398fa0919991d5b884d43cf49dabff400760e"} Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.949572 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d87mg" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.949601 4546 scope.go:117] "RemoveContainer" containerID="63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.984719 4546 scope.go:117] "RemoveContainer" containerID="1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461" Feb 01 07:08:38 crc kubenswrapper[4546]: I0201 07:08:38.990963 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d87mg"] Feb 01 07:08:39 crc kubenswrapper[4546]: I0201 07:08:39.015940 4546 scope.go:117] "RemoveContainer" containerID="5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498" Feb 01 07:08:39 crc kubenswrapper[4546]: I0201 07:08:39.024388 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d87mg"] Feb 01 07:08:39 crc kubenswrapper[4546]: I0201 07:08:39.052018 4546 scope.go:117] "RemoveContainer" containerID="63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808" Feb 01 07:08:39 crc kubenswrapper[4546]: E0201 07:08:39.052577 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808\": container with ID starting with 63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808 not found: ID does not exist" containerID="63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808" Feb 01 07:08:39 crc kubenswrapper[4546]: I0201 07:08:39.052628 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808"} err="failed to get container status \"63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808\": rpc error: code = NotFound desc = could not find container 
\"63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808\": container with ID starting with 63affcda4f62f6b65e9e5bc0605277885d19ca20b20e6db7979067761aafb808 not found: ID does not exist" Feb 01 07:08:39 crc kubenswrapper[4546]: I0201 07:08:39.052654 4546 scope.go:117] "RemoveContainer" containerID="1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461" Feb 01 07:08:39 crc kubenswrapper[4546]: E0201 07:08:39.053072 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461\": container with ID starting with 1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461 not found: ID does not exist" containerID="1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461" Feb 01 07:08:39 crc kubenswrapper[4546]: I0201 07:08:39.053094 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461"} err="failed to get container status \"1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461\": rpc error: code = NotFound desc = could not find container \"1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461\": container with ID starting with 1c7cbe5b4427584e9c83010dd895208e6ec42a6039fc4618507de84e1d348461 not found: ID does not exist" Feb 01 07:08:39 crc kubenswrapper[4546]: I0201 07:08:39.053110 4546 scope.go:117] "RemoveContainer" containerID="5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498" Feb 01 07:08:39 crc kubenswrapper[4546]: E0201 07:08:39.053496 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498\": container with ID starting with 5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498 not found: ID does not exist" 
containerID="5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498" Feb 01 07:08:39 crc kubenswrapper[4546]: I0201 07:08:39.053623 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498"} err="failed to get container status \"5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498\": rpc error: code = NotFound desc = could not find container \"5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498\": container with ID starting with 5b7e3b14813d3979818560d6755460857efe57671eeceb0073f766850269b498 not found: ID does not exist" Feb 01 07:08:39 crc kubenswrapper[4546]: I0201 07:08:39.666664 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" path="/var/lib/kubelet/pods/a8cf016c-4bdf-4e9a-83db-4ba1eb81f421/volumes" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.532128 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cg7h7"] Feb 01 07:08:47 crc kubenswrapper[4546]: E0201 07:08:47.533132 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerName="extract-content" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.533146 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerName="extract-content" Feb 01 07:08:47 crc kubenswrapper[4546]: E0201 07:08:47.533166 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerName="registry-server" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.533172 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerName="registry-server" Feb 01 07:08:47 crc kubenswrapper[4546]: E0201 07:08:47.533205 4546 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerName="extract-utilities" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.533212 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerName="extract-utilities" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.533401 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8cf016c-4bdf-4e9a-83db-4ba1eb81f421" containerName="registry-server" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.534850 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.550258 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cg7h7"] Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.634015 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-utilities\") pod \"community-operators-cg7h7\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.634146 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr7hj\" (UniqueName: \"kubernetes.io/projected/6e155e63-6eca-4a93-a51f-033c319a970a-kube-api-access-gr7hj\") pod \"community-operators-cg7h7\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.634204 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-catalog-content\") pod 
\"community-operators-cg7h7\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.737743 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-utilities\") pod \"community-operators-cg7h7\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.738269 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-utilities\") pod \"community-operators-cg7h7\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.738448 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr7hj\" (UniqueName: \"kubernetes.io/projected/6e155e63-6eca-4a93-a51f-033c319a970a-kube-api-access-gr7hj\") pod \"community-operators-cg7h7\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.738542 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-catalog-content\") pod \"community-operators-cg7h7\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.739111 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-catalog-content\") pod \"community-operators-cg7h7\" (UID: 
\"6e155e63-6eca-4a93-a51f-033c319a970a\") " pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.763484 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr7hj\" (UniqueName: \"kubernetes.io/projected/6e155e63-6eca-4a93-a51f-033c319a970a-kube-api-access-gr7hj\") pod \"community-operators-cg7h7\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:47 crc kubenswrapper[4546]: I0201 07:08:47.857935 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:48 crc kubenswrapper[4546]: I0201 07:08:48.313358 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cg7h7"] Feb 01 07:08:48 crc kubenswrapper[4546]: W0201 07:08:48.320870 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e155e63_6eca_4a93_a51f_033c319a970a.slice/crio-4a9ebb2c2b492992d397b3babeff766fc3990cdf2eed14ffc3592e09620bf702 WatchSource:0}: Error finding container 4a9ebb2c2b492992d397b3babeff766fc3990cdf2eed14ffc3592e09620bf702: Status 404 returned error can't find the container with id 4a9ebb2c2b492992d397b3babeff766fc3990cdf2eed14ffc3592e09620bf702 Feb 01 07:08:48 crc kubenswrapper[4546]: I0201 07:08:48.654828 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:08:48 crc kubenswrapper[4546]: E0201 07:08:48.655461 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:08:49 crc kubenswrapper[4546]: I0201 07:08:49.055811 4546 generic.go:334] "Generic (PLEG): container finished" podID="6e155e63-6eca-4a93-a51f-033c319a970a" containerID="6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff" exitCode=0 Feb 01 07:08:49 crc kubenswrapper[4546]: I0201 07:08:49.055882 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg7h7" event={"ID":"6e155e63-6eca-4a93-a51f-033c319a970a","Type":"ContainerDied","Data":"6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff"} Feb 01 07:08:49 crc kubenswrapper[4546]: I0201 07:08:49.055915 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg7h7" event={"ID":"6e155e63-6eca-4a93-a51f-033c319a970a","Type":"ContainerStarted","Data":"4a9ebb2c2b492992d397b3babeff766fc3990cdf2eed14ffc3592e09620bf702"} Feb 01 07:08:50 crc kubenswrapper[4546]: I0201 07:08:50.072234 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg7h7" event={"ID":"6e155e63-6eca-4a93-a51f-033c319a970a","Type":"ContainerStarted","Data":"99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d"} Feb 01 07:08:51 crc kubenswrapper[4546]: I0201 07:08:51.092072 4546 generic.go:334] "Generic (PLEG): container finished" podID="6e155e63-6eca-4a93-a51f-033c319a970a" containerID="99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d" exitCode=0 Feb 01 07:08:51 crc kubenswrapper[4546]: I0201 07:08:51.092457 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg7h7" event={"ID":"6e155e63-6eca-4a93-a51f-033c319a970a","Type":"ContainerDied","Data":"99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d"} Feb 01 07:08:52 crc kubenswrapper[4546]: I0201 07:08:52.103969 4546 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg7h7" event={"ID":"6e155e63-6eca-4a93-a51f-033c319a970a","Type":"ContainerStarted","Data":"e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a"} Feb 01 07:08:52 crc kubenswrapper[4546]: I0201 07:08:52.129255 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cg7h7" podStartSLOduration=2.534687393 podStartE2EDuration="5.129230286s" podCreationTimestamp="2026-02-01 07:08:47 +0000 UTC" firstStartedPulling="2026-02-01 07:08:49.057823589 +0000 UTC m=+1559.708759604" lastFinishedPulling="2026-02-01 07:08:51.652366481 +0000 UTC m=+1562.303302497" observedRunningTime="2026-02-01 07:08:52.125480715 +0000 UTC m=+1562.776416731" watchObservedRunningTime="2026-02-01 07:08:52.129230286 +0000 UTC m=+1562.780166301" Feb 01 07:08:56 crc kubenswrapper[4546]: I0201 07:08:56.028852 4546 scope.go:117] "RemoveContainer" containerID="7387d0540462a56826d95378b0f343e5f40a5b9f2809ffe02c0191c1f245881e" Feb 01 07:08:56 crc kubenswrapper[4546]: I0201 07:08:56.067351 4546 scope.go:117] "RemoveContainer" containerID="0a4d32d91dc7b8a6390654f4a33444f520d817581ac9dd9e029b885d48bf0af0" Feb 01 07:08:56 crc kubenswrapper[4546]: I0201 07:08:56.126292 4546 scope.go:117] "RemoveContainer" containerID="8a1cfa49fdc5ff1dbc4a657cffc212f55d16123cd836ab421475783c61e3cad9" Feb 01 07:08:57 crc kubenswrapper[4546]: I0201 07:08:57.858895 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:57 crc kubenswrapper[4546]: I0201 07:08:57.858950 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:57 crc kubenswrapper[4546]: I0201 07:08:57.901970 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cg7h7" 
Feb 01 07:08:58 crc kubenswrapper[4546]: I0201 07:08:58.196669 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:08:58 crc kubenswrapper[4546]: I0201 07:08:58.247327 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cg7h7"] Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.178279 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cg7h7" podUID="6e155e63-6eca-4a93-a51f-033c319a970a" containerName="registry-server" containerID="cri-o://e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a" gracePeriod=2 Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.600001 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.736849 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-catalog-content\") pod \"6e155e63-6eca-4a93-a51f-033c319a970a\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.737144 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr7hj\" (UniqueName: \"kubernetes.io/projected/6e155e63-6eca-4a93-a51f-033c319a970a-kube-api-access-gr7hj\") pod \"6e155e63-6eca-4a93-a51f-033c319a970a\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") " Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.737316 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-utilities\") pod \"6e155e63-6eca-4a93-a51f-033c319a970a\" (UID: \"6e155e63-6eca-4a93-a51f-033c319a970a\") 
" Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.738188 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-utilities" (OuterVolumeSpecName: "utilities") pod "6e155e63-6eca-4a93-a51f-033c319a970a" (UID: "6e155e63-6eca-4a93-a51f-033c319a970a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.741137 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.747134 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e155e63-6eca-4a93-a51f-033c319a970a-kube-api-access-gr7hj" (OuterVolumeSpecName: "kube-api-access-gr7hj") pod "6e155e63-6eca-4a93-a51f-033c319a970a" (UID: "6e155e63-6eca-4a93-a51f-033c319a970a"). InnerVolumeSpecName "kube-api-access-gr7hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.781242 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e155e63-6eca-4a93-a51f-033c319a970a" (UID: "6e155e63-6eca-4a93-a51f-033c319a970a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.844243 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr7hj\" (UniqueName: \"kubernetes.io/projected/6e155e63-6eca-4a93-a51f-033c319a970a-kube-api-access-gr7hj\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:00 crc kubenswrapper[4546]: I0201 07:09:00.844290 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e155e63-6eca-4a93-a51f-033c319a970a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.192833 4546 generic.go:334] "Generic (PLEG): container finished" podID="6e155e63-6eca-4a93-a51f-033c319a970a" containerID="e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a" exitCode=0 Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.192895 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg7h7" event={"ID":"6e155e63-6eca-4a93-a51f-033c319a970a","Type":"ContainerDied","Data":"e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a"} Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.192907 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cg7h7" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.192935 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cg7h7" event={"ID":"6e155e63-6eca-4a93-a51f-033c319a970a","Type":"ContainerDied","Data":"4a9ebb2c2b492992d397b3babeff766fc3990cdf2eed14ffc3592e09620bf702"} Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.192970 4546 scope.go:117] "RemoveContainer" containerID="e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.229102 4546 scope.go:117] "RemoveContainer" containerID="99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.235261 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cg7h7"] Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.239826 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cg7h7"] Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.250410 4546 scope.go:117] "RemoveContainer" containerID="6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.277568 4546 scope.go:117] "RemoveContainer" containerID="e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a" Feb 01 07:09:01 crc kubenswrapper[4546]: E0201 07:09:01.278089 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a\": container with ID starting with e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a not found: ID does not exist" containerID="e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.278128 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a"} err="failed to get container status \"e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a\": rpc error: code = NotFound desc = could not find container \"e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a\": container with ID starting with e249ebd42fe3d3791650c7406522402b769f580b09592d555ffa6ede9236a73a not found: ID does not exist" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.278153 4546 scope.go:117] "RemoveContainer" containerID="99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d" Feb 01 07:09:01 crc kubenswrapper[4546]: E0201 07:09:01.278382 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d\": container with ID starting with 99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d not found: ID does not exist" containerID="99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.278399 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d"} err="failed to get container status \"99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d\": rpc error: code = NotFound desc = could not find container \"99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d\": container with ID starting with 99478e232299846d66c90c2e486592282f57aa40cdc82a347f68788ffa55637d not found: ID does not exist" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.278412 4546 scope.go:117] "RemoveContainer" containerID="6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff" Feb 01 07:09:01 crc kubenswrapper[4546]: E0201 
07:09:01.278634 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff\": container with ID starting with 6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff not found: ID does not exist" containerID="6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.278661 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff"} err="failed to get container status \"6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff\": rpc error: code = NotFound desc = could not find container \"6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff\": container with ID starting with 6a7b7f3098ed34b266046c726b44fde7bff9fd8d0618a1edbd30516d426626ff not found: ID does not exist" Feb 01 07:09:01 crc kubenswrapper[4546]: I0201 07:09:01.666597 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e155e63-6eca-4a93-a51f-033c319a970a" path="/var/lib/kubelet/pods/6e155e63-6eca-4a93-a51f-033c319a970a/volumes" Feb 01 07:09:03 crc kubenswrapper[4546]: I0201 07:09:03.654764 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:09:03 crc kubenswrapper[4546]: E0201 07:09:03.655334 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:09:08 crc kubenswrapper[4546]: I0201 07:09:08.046438 
4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-wk829"] Feb 01 07:09:08 crc kubenswrapper[4546]: I0201 07:09:08.051539 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-wk829"] Feb 01 07:09:08 crc kubenswrapper[4546]: I0201 07:09:08.057984 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4kbrz"] Feb 01 07:09:08 crc kubenswrapper[4546]: I0201 07:09:08.063059 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4kbrz"] Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.049611 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lvf28"] Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.064930 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cfac-account-create-update-lhf9t"] Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.074530 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9d33-account-create-update-jz942"] Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.085920 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lvf28"] Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.091089 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8edb-account-create-update-dzs8g"] Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.096712 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cfac-account-create-update-lhf9t"] Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.102219 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9d33-account-create-update-jz942"] Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.107790 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8edb-account-create-update-dzs8g"] Feb 01 07:09:09 crc 
kubenswrapper[4546]: I0201 07:09:09.667253 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e59900e-a73b-4d2c-be24-130f43e15f6d" path="/var/lib/kubelet/pods/0e59900e-a73b-4d2c-be24-130f43e15f6d/volumes" Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.668622 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77705152-25fc-47d3-b448-00144a74f075" path="/var/lib/kubelet/pods/77705152-25fc-47d3-b448-00144a74f075/volumes" Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.669757 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab34a556-843f-4e9a-becd-82452d0ad83d" path="/var/lib/kubelet/pods/ab34a556-843f-4e9a-becd-82452d0ad83d/volumes" Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.670901 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ce9cfb-87da-40b6-9676-492ba3cce8b6" path="/var/lib/kubelet/pods/b2ce9cfb-87da-40b6-9676-492ba3cce8b6/volumes" Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.672170 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60f534b-2c84-4054-99c1-c682e0a58c7f" path="/var/lib/kubelet/pods/b60f534b-2c84-4054-99c1-c682e0a58c7f/volumes" Feb 01 07:09:09 crc kubenswrapper[4546]: I0201 07:09:09.672986 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b" path="/var/lib/kubelet/pods/ecae2b5c-8b1c-46cb-bcb2-b544da7ec29b/volumes" Feb 01 07:09:14 crc kubenswrapper[4546]: I0201 07:09:14.329525 4546 generic.go:334] "Generic (PLEG): container finished" podID="0669231c-f180-4931-b874-16f2ed38e2b4" containerID="3cc079fac5e713994755ee8227804c0dbd73ddf240fb3690b5618429accbd54b" exitCode=0 Feb 01 07:09:14 crc kubenswrapper[4546]: I0201 07:09:14.329718 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" 
event={"ID":"0669231c-f180-4931-b874-16f2ed38e2b4","Type":"ContainerDied","Data":"3cc079fac5e713994755ee8227804c0dbd73ddf240fb3690b5618429accbd54b"} Feb 01 07:09:15 crc kubenswrapper[4546]: I0201 07:09:15.659642 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:09:15 crc kubenswrapper[4546]: E0201 07:09:15.659963 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:09:15 crc kubenswrapper[4546]: I0201 07:09:15.820796 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:09:15 crc kubenswrapper[4546]: I0201 07:09:15.920754 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-ssh-key-openstack-edpm-ipam\") pod \"0669231c-f180-4931-b874-16f2ed38e2b4\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " Feb 01 07:09:15 crc kubenswrapper[4546]: I0201 07:09:15.920813 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7tfz\" (UniqueName: \"kubernetes.io/projected/0669231c-f180-4931-b874-16f2ed38e2b4-kube-api-access-p7tfz\") pod \"0669231c-f180-4931-b874-16f2ed38e2b4\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " Feb 01 07:09:15 crc kubenswrapper[4546]: I0201 07:09:15.920994 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-inventory\") pod \"0669231c-f180-4931-b874-16f2ed38e2b4\" (UID: \"0669231c-f180-4931-b874-16f2ed38e2b4\") " Feb 01 07:09:15 crc kubenswrapper[4546]: I0201 07:09:15.947131 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0669231c-f180-4931-b874-16f2ed38e2b4-kube-api-access-p7tfz" (OuterVolumeSpecName: "kube-api-access-p7tfz") pod "0669231c-f180-4931-b874-16f2ed38e2b4" (UID: "0669231c-f180-4931-b874-16f2ed38e2b4"). InnerVolumeSpecName "kube-api-access-p7tfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:15 crc kubenswrapper[4546]: I0201 07:09:15.989967 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-inventory" (OuterVolumeSpecName: "inventory") pod "0669231c-f180-4931-b874-16f2ed38e2b4" (UID: "0669231c-f180-4931-b874-16f2ed38e2b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.019974 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0669231c-f180-4931-b874-16f2ed38e2b4" (UID: "0669231c-f180-4931-b874-16f2ed38e2b4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.022211 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.022238 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0669231c-f180-4931-b874-16f2ed38e2b4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.022252 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7tfz\" (UniqueName: \"kubernetes.io/projected/0669231c-f180-4931-b874-16f2ed38e2b4-kube-api-access-p7tfz\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.384215 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" event={"ID":"0669231c-f180-4931-b874-16f2ed38e2b4","Type":"ContainerDied","Data":"4e3f0b74da1f467475c921c4d2c7c09ab551fc3a36db76308fa74c3933111091"} Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.384261 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e3f0b74da1f467475c921c4d2c7c09ab551fc3a36db76308fa74c3933111091" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.384319 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tq9ws" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.441355 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79"] Feb 01 07:09:16 crc kubenswrapper[4546]: E0201 07:09:16.441789 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e155e63-6eca-4a93-a51f-033c319a970a" containerName="extract-content" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.441811 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e155e63-6eca-4a93-a51f-033c319a970a" containerName="extract-content" Feb 01 07:09:16 crc kubenswrapper[4546]: E0201 07:09:16.441826 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e155e63-6eca-4a93-a51f-033c319a970a" containerName="registry-server" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.441832 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e155e63-6eca-4a93-a51f-033c319a970a" containerName="registry-server" Feb 01 07:09:16 crc kubenswrapper[4546]: E0201 07:09:16.441878 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0669231c-f180-4931-b874-16f2ed38e2b4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.441886 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="0669231c-f180-4931-b874-16f2ed38e2b4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 01 07:09:16 crc kubenswrapper[4546]: E0201 07:09:16.441899 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e155e63-6eca-4a93-a51f-033c319a970a" containerName="extract-utilities" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.441904 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e155e63-6eca-4a93-a51f-033c319a970a" containerName="extract-utilities" Feb 01 07:09:16 crc 
kubenswrapper[4546]: I0201 07:09:16.442111 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="0669231c-f180-4931-b874-16f2ed38e2b4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.442131 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e155e63-6eca-4a93-a51f-033c319a970a" containerName="registry-server" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.442773 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.444936 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.450281 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.450350 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.451221 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.456546 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79"] Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.531941 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45v79\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.532042 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45v79\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.532417 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87nzj\" (UniqueName: \"kubernetes.io/projected/f529131b-44c3-4899-aefb-ef023cd17a19-kube-api-access-87nzj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45v79\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.635605 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45v79\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.635976 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87nzj\" (UniqueName: \"kubernetes.io/projected/f529131b-44c3-4899-aefb-ef023cd17a19-kube-api-access-87nzj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45v79\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 
crc kubenswrapper[4546]: I0201 07:09:16.636157 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45v79\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.640221 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45v79\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.640220 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45v79\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.653076 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87nzj\" (UniqueName: \"kubernetes.io/projected/f529131b-44c3-4899-aefb-ef023cd17a19-kube-api-access-87nzj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-45v79\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:16 crc kubenswrapper[4546]: I0201 07:09:16.765952 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:17 crc kubenswrapper[4546]: I0201 07:09:17.286848 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79"] Feb 01 07:09:17 crc kubenswrapper[4546]: I0201 07:09:17.392707 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" event={"ID":"f529131b-44c3-4899-aefb-ef023cd17a19","Type":"ContainerStarted","Data":"27c691431d7258bfab1e2304427eed2fce798fd80b8ab1d08287764f990f61ae"} Feb 01 07:09:18 crc kubenswrapper[4546]: I0201 07:09:18.406654 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" event={"ID":"f529131b-44c3-4899-aefb-ef023cd17a19","Type":"ContainerStarted","Data":"cc9ef068f2c8760d1c64772326ec92d91d7131eb88e48f7120208733f69177ed"} Feb 01 07:09:18 crc kubenswrapper[4546]: I0201 07:09:18.427981 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" podStartSLOduration=1.915372321 podStartE2EDuration="2.4279594s" podCreationTimestamp="2026-02-01 07:09:16 +0000 UTC" firstStartedPulling="2026-02-01 07:09:17.292359597 +0000 UTC m=+1587.943295613" lastFinishedPulling="2026-02-01 07:09:17.804946676 +0000 UTC m=+1588.455882692" observedRunningTime="2026-02-01 07:09:18.421953977 +0000 UTC m=+1589.072889994" watchObservedRunningTime="2026-02-01 07:09:18.4279594 +0000 UTC m=+1589.078895416" Feb 01 07:09:22 crc kubenswrapper[4546]: I0201 07:09:22.443581 4546 generic.go:334] "Generic (PLEG): container finished" podID="f529131b-44c3-4899-aefb-ef023cd17a19" containerID="cc9ef068f2c8760d1c64772326ec92d91d7131eb88e48f7120208733f69177ed" exitCode=0 Feb 01 07:09:22 crc kubenswrapper[4546]: I0201 07:09:22.443680 4546 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" event={"ID":"f529131b-44c3-4899-aefb-ef023cd17a19","Type":"ContainerDied","Data":"cc9ef068f2c8760d1c64772326ec92d91d7131eb88e48f7120208733f69177ed"} Feb 01 07:09:23 crc kubenswrapper[4546]: I0201 07:09:23.898341 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.004840 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-inventory\") pod \"f529131b-44c3-4899-aefb-ef023cd17a19\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.004986 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-ssh-key-openstack-edpm-ipam\") pod \"f529131b-44c3-4899-aefb-ef023cd17a19\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.005078 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87nzj\" (UniqueName: \"kubernetes.io/projected/f529131b-44c3-4899-aefb-ef023cd17a19-kube-api-access-87nzj\") pod \"f529131b-44c3-4899-aefb-ef023cd17a19\" (UID: \"f529131b-44c3-4899-aefb-ef023cd17a19\") " Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.021068 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f529131b-44c3-4899-aefb-ef023cd17a19-kube-api-access-87nzj" (OuterVolumeSpecName: "kube-api-access-87nzj") pod "f529131b-44c3-4899-aefb-ef023cd17a19" (UID: "f529131b-44c3-4899-aefb-ef023cd17a19"). InnerVolumeSpecName "kube-api-access-87nzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.037615 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-inventory" (OuterVolumeSpecName: "inventory") pod "f529131b-44c3-4899-aefb-ef023cd17a19" (UID: "f529131b-44c3-4899-aefb-ef023cd17a19"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.045125 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f529131b-44c3-4899-aefb-ef023cd17a19" (UID: "f529131b-44c3-4899-aefb-ef023cd17a19"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.107313 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.107598 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f529131b-44c3-4899-aefb-ef023cd17a19-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.107612 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87nzj\" (UniqueName: \"kubernetes.io/projected/f529131b-44c3-4899-aefb-ef023cd17a19-kube-api-access-87nzj\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.463384 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" 
event={"ID":"f529131b-44c3-4899-aefb-ef023cd17a19","Type":"ContainerDied","Data":"27c691431d7258bfab1e2304427eed2fce798fd80b8ab1d08287764f990f61ae"} Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.463450 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27c691431d7258bfab1e2304427eed2fce798fd80b8ab1d08287764f990f61ae" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.463454 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-45v79" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.540319 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw"] Feb 01 07:09:24 crc kubenswrapper[4546]: E0201 07:09:24.540759 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f529131b-44c3-4899-aefb-ef023cd17a19" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.540788 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f529131b-44c3-4899-aefb-ef023cd17a19" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.540993 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f529131b-44c3-4899-aefb-ef023cd17a19" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.541630 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.545175 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.545433 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.545598 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.545886 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.561905 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw"] Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.722822 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rz7lw\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.723214 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rz7lw\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.723430 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-769mk\" (UniqueName: \"kubernetes.io/projected/74ab71b9-867a-42a9-8201-f11eb7cb330c-kube-api-access-769mk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rz7lw\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.825105 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rz7lw\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.825191 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-769mk\" (UniqueName: \"kubernetes.io/projected/74ab71b9-867a-42a9-8201-f11eb7cb330c-kube-api-access-769mk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rz7lw\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.825334 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rz7lw\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.830331 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rz7lw\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.832729 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rz7lw\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.845598 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-769mk\" (UniqueName: \"kubernetes.io/projected/74ab71b9-867a-42a9-8201-f11eb7cb330c-kube-api-access-769mk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rz7lw\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:24 crc kubenswrapper[4546]: I0201 07:09:24.855576 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:25 crc kubenswrapper[4546]: I0201 07:09:25.383822 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw"] Feb 01 07:09:25 crc kubenswrapper[4546]: I0201 07:09:25.475079 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" event={"ID":"74ab71b9-867a-42a9-8201-f11eb7cb330c","Type":"ContainerStarted","Data":"84dcdeb5926e399ac741226bf6920d15970af09b57f92cbea073f8359e45f01c"} Feb 01 07:09:26 crc kubenswrapper[4546]: I0201 07:09:26.486030 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" event={"ID":"74ab71b9-867a-42a9-8201-f11eb7cb330c","Type":"ContainerStarted","Data":"034e577a118436e4b99ff39e81edc9f5c9280c780021fc6042e0fdf78bf3d1e9"} Feb 01 07:09:26 crc kubenswrapper[4546]: I0201 07:09:26.508663 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" podStartSLOduration=2.034600331 podStartE2EDuration="2.508646514s" podCreationTimestamp="2026-02-01 07:09:24 +0000 UTC" firstStartedPulling="2026-02-01 07:09:25.391606123 +0000 UTC m=+1596.042542129" lastFinishedPulling="2026-02-01 07:09:25.865652295 +0000 UTC m=+1596.516588312" observedRunningTime="2026-02-01 07:09:26.499704614 +0000 UTC m=+1597.150640630" watchObservedRunningTime="2026-02-01 07:09:26.508646514 +0000 UTC m=+1597.159582530" Feb 01 07:09:29 crc kubenswrapper[4546]: I0201 07:09:29.659598 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:09:29 crc kubenswrapper[4546]: E0201 07:09:29.660668 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:09:36 crc kubenswrapper[4546]: I0201 07:09:36.045638 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fvmdz"] Feb 01 07:09:36 crc kubenswrapper[4546]: I0201 07:09:36.058819 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fvmdz"] Feb 01 07:09:37 crc kubenswrapper[4546]: I0201 07:09:37.665630 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69960492-73f4-4adf-94c6-f8f6ea237503" path="/var/lib/kubelet/pods/69960492-73f4-4adf-94c6-f8f6ea237503/volumes" Feb 01 07:09:41 crc kubenswrapper[4546]: I0201 07:09:41.655816 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:09:41 crc kubenswrapper[4546]: E0201 07:09:41.656828 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:09:52 crc kubenswrapper[4546]: I0201 07:09:52.655516 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:09:52 crc kubenswrapper[4546]: E0201 07:09:52.656638 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:09:56 crc kubenswrapper[4546]: I0201 07:09:56.235440 4546 scope.go:117] "RemoveContainer" containerID="34e90168d4d7b9f5e17f2abe8baf07cc356ed0e18d9351910cae5edfc03efcec" Feb 01 07:09:56 crc kubenswrapper[4546]: I0201 07:09:56.264077 4546 scope.go:117] "RemoveContainer" containerID="832c4e5e70e268faf40ffc5ca3cca6f3a014fbe7ee29f7231121bb26bb237701" Feb 01 07:09:56 crc kubenswrapper[4546]: I0201 07:09:56.307510 4546 scope.go:117] "RemoveContainer" containerID="bcafe1ab9e5c4ce3f8238bb18507759966fde00caa59ff2e1f79e841a950f01e" Feb 01 07:09:56 crc kubenswrapper[4546]: I0201 07:09:56.349048 4546 scope.go:117] "RemoveContainer" containerID="730f1d5d3c50e0bc4c7ff5c194eb5f8ec07c3d3089a8f4a08fef77bb51f056ef" Feb 01 07:09:56 crc kubenswrapper[4546]: I0201 07:09:56.374047 4546 scope.go:117] "RemoveContainer" containerID="f6d72ac5c09ae960f171fdfd21cbe511552cc2f3994e4cda668dc78bf8381031" Feb 01 07:09:56 crc kubenswrapper[4546]: I0201 07:09:56.418104 4546 scope.go:117] "RemoveContainer" containerID="d7186a6dac4bf0bb0c2e5151c6b3c2e14f328f4a1fa681ca306d410166ac5f18" Feb 01 07:09:56 crc kubenswrapper[4546]: I0201 07:09:56.446741 4546 scope.go:117] "RemoveContainer" containerID="9067223864ae5d28608217cc1cfc5125d7a838015e0d219c726a183ba312df27" Feb 01 07:09:56 crc kubenswrapper[4546]: I0201 07:09:56.787965 4546 generic.go:334] "Generic (PLEG): container finished" podID="74ab71b9-867a-42a9-8201-f11eb7cb330c" containerID="034e577a118436e4b99ff39e81edc9f5c9280c780021fc6042e0fdf78bf3d1e9" exitCode=0 Feb 01 07:09:56 crc kubenswrapper[4546]: I0201 07:09:56.788022 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" 
event={"ID":"74ab71b9-867a-42a9-8201-f11eb7cb330c","Type":"ContainerDied","Data":"034e577a118436e4b99ff39e81edc9f5c9280c780021fc6042e0fdf78bf3d1e9"} Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.081541 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k6q2g"] Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.097779 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k6q2g"] Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.194847 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.345900 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-769mk\" (UniqueName: \"kubernetes.io/projected/74ab71b9-867a-42a9-8201-f11eb7cb330c-kube-api-access-769mk\") pod \"74ab71b9-867a-42a9-8201-f11eb7cb330c\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.345967 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-inventory\") pod \"74ab71b9-867a-42a9-8201-f11eb7cb330c\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.346140 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-ssh-key-openstack-edpm-ipam\") pod \"74ab71b9-867a-42a9-8201-f11eb7cb330c\" (UID: \"74ab71b9-867a-42a9-8201-f11eb7cb330c\") " Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.352195 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/74ab71b9-867a-42a9-8201-f11eb7cb330c-kube-api-access-769mk" (OuterVolumeSpecName: "kube-api-access-769mk") pod "74ab71b9-867a-42a9-8201-f11eb7cb330c" (UID: "74ab71b9-867a-42a9-8201-f11eb7cb330c"). InnerVolumeSpecName "kube-api-access-769mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.372172 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-inventory" (OuterVolumeSpecName: "inventory") pod "74ab71b9-867a-42a9-8201-f11eb7cb330c" (UID: "74ab71b9-867a-42a9-8201-f11eb7cb330c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.375059 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "74ab71b9-867a-42a9-8201-f11eb7cb330c" (UID: "74ab71b9-867a-42a9-8201-f11eb7cb330c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.449507 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-769mk\" (UniqueName: \"kubernetes.io/projected/74ab71b9-867a-42a9-8201-f11eb7cb330c-kube-api-access-769mk\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.449547 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.449561 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74ab71b9-867a-42a9-8201-f11eb7cb330c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.810906 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" event={"ID":"74ab71b9-867a-42a9-8201-f11eb7cb330c","Type":"ContainerDied","Data":"84dcdeb5926e399ac741226bf6920d15970af09b57f92cbea073f8359e45f01c"} Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.811074 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84dcdeb5926e399ac741226bf6920d15970af09b57f92cbea073f8359e45f01c" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.811410 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rz7lw" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.902043 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg"] Feb 01 07:09:58 crc kubenswrapper[4546]: E0201 07:09:58.902609 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ab71b9-867a-42a9-8201-f11eb7cb330c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.902701 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ab71b9-867a-42a9-8201-f11eb7cb330c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.903012 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ab71b9-867a-42a9-8201-f11eb7cb330c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.903809 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.909016 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.909963 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.911377 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg"] Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.916565 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.916592 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.961404 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5q45\" (UniqueName: \"kubernetes.io/projected/cd86c7f5-3661-4744-8cff-69341ae0ae33-kube-api-access-k5q45\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:58 crc kubenswrapper[4546]: I0201 07:09:58.961486 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:58 crc 
kubenswrapper[4546]: I0201 07:09:58.961537 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.063864 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5q45\" (UniqueName: \"kubernetes.io/projected/cd86c7f5-3661-4744-8cff-69341ae0ae33-kube-api-access-k5q45\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.064237 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.064305 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.068070 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.069293 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.080492 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5q45\" (UniqueName: \"kubernetes.io/projected/cd86c7f5-3661-4744-8cff-69341ae0ae33-kube-api-access-k5q45\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.220132 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.666110 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b02eabc-af33-4e6e-8e03-e95876644ea7" path="/var/lib/kubelet/pods/4b02eabc-af33-4e6e-8e03-e95876644ea7/volumes" Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.688617 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg"] Feb 01 07:09:59 crc kubenswrapper[4546]: I0201 07:09:59.819558 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" event={"ID":"cd86c7f5-3661-4744-8cff-69341ae0ae33","Type":"ContainerStarted","Data":"7c9ecee26e236164bf1997c19ec3ccaaec1c01a8c5e93a90b2c25779a8358617"} Feb 01 07:10:00 crc kubenswrapper[4546]: I0201 07:10:00.030334 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-rq7cx"] Feb 01 07:10:00 crc kubenswrapper[4546]: I0201 07:10:00.039216 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-rq7cx"] Feb 01 07:10:00 crc kubenswrapper[4546]: I0201 07:10:00.846902 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" event={"ID":"cd86c7f5-3661-4744-8cff-69341ae0ae33","Type":"ContainerStarted","Data":"17f18fa5eff504ae67ac281123f6cbbaaa70833e927c09e70f148641a58e4959"} Feb 01 07:10:00 crc kubenswrapper[4546]: I0201 07:10:00.877824 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" podStartSLOduration=2.341770384 podStartE2EDuration="2.877802399s" podCreationTimestamp="2026-02-01 07:09:58 +0000 UTC" firstStartedPulling="2026-02-01 07:09:59.696880842 +0000 UTC m=+1630.347816857" lastFinishedPulling="2026-02-01 
07:10:00.232912856 +0000 UTC m=+1630.883848872" observedRunningTime="2026-02-01 07:10:00.872742077 +0000 UTC m=+1631.523678093" watchObservedRunningTime="2026-02-01 07:10:00.877802399 +0000 UTC m=+1631.528738415" Feb 01 07:10:01 crc kubenswrapper[4546]: I0201 07:10:01.667075 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fdf5e3f-6e33-4f70-95e1-c54b7c97df47" path="/var/lib/kubelet/pods/8fdf5e3f-6e33-4f70-95e1-c54b7c97df47/volumes" Feb 01 07:10:04 crc kubenswrapper[4546]: I0201 07:10:04.655124 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:10:04 crc kubenswrapper[4546]: E0201 07:10:04.657141 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:10:17 crc kubenswrapper[4546]: I0201 07:10:17.655067 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:10:17 crc kubenswrapper[4546]: E0201 07:10:17.655843 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:10:29 crc kubenswrapper[4546]: I0201 07:10:29.661125 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 
07:10:29 crc kubenswrapper[4546]: E0201 07:10:29.662288 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:10:37 crc kubenswrapper[4546]: I0201 07:10:37.221812 4546 generic.go:334] "Generic (PLEG): container finished" podID="cd86c7f5-3661-4744-8cff-69341ae0ae33" containerID="17f18fa5eff504ae67ac281123f6cbbaaa70833e927c09e70f148641a58e4959" exitCode=0 Feb 01 07:10:37 crc kubenswrapper[4546]: I0201 07:10:37.222506 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" event={"ID":"cd86c7f5-3661-4744-8cff-69341ae0ae33","Type":"ContainerDied","Data":"17f18fa5eff504ae67ac281123f6cbbaaa70833e927c09e70f148641a58e4959"} Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.604924 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.695270 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-ssh-key-openstack-edpm-ipam\") pod \"cd86c7f5-3661-4744-8cff-69341ae0ae33\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.695699 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-inventory\") pod \"cd86c7f5-3661-4744-8cff-69341ae0ae33\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.695850 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5q45\" (UniqueName: \"kubernetes.io/projected/cd86c7f5-3661-4744-8cff-69341ae0ae33-kube-api-access-k5q45\") pod \"cd86c7f5-3661-4744-8cff-69341ae0ae33\" (UID: \"cd86c7f5-3661-4744-8cff-69341ae0ae33\") " Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.702499 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd86c7f5-3661-4744-8cff-69341ae0ae33-kube-api-access-k5q45" (OuterVolumeSpecName: "kube-api-access-k5q45") pod "cd86c7f5-3661-4744-8cff-69341ae0ae33" (UID: "cd86c7f5-3661-4744-8cff-69341ae0ae33"). InnerVolumeSpecName "kube-api-access-k5q45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.728299 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-inventory" (OuterVolumeSpecName: "inventory") pod "cd86c7f5-3661-4744-8cff-69341ae0ae33" (UID: "cd86c7f5-3661-4744-8cff-69341ae0ae33"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.752022 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cd86c7f5-3661-4744-8cff-69341ae0ae33" (UID: "cd86c7f5-3661-4744-8cff-69341ae0ae33"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.802082 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.802120 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5q45\" (UniqueName: \"kubernetes.io/projected/cd86c7f5-3661-4744-8cff-69341ae0ae33-kube-api-access-k5q45\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:38 crc kubenswrapper[4546]: I0201 07:10:38.802136 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd86c7f5-3661-4744-8cff-69341ae0ae33-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.245728 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" event={"ID":"cd86c7f5-3661-4744-8cff-69341ae0ae33","Type":"ContainerDied","Data":"7c9ecee26e236164bf1997c19ec3ccaaec1c01a8c5e93a90b2c25779a8358617"} Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.246183 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c9ecee26e236164bf1997c19ec3ccaaec1c01a8c5e93a90b2c25779a8358617" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 
07:10:39.246087 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7jpqg" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.346081 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dvp5n"] Feb 01 07:10:39 crc kubenswrapper[4546]: E0201 07:10:39.346614 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd86c7f5-3661-4744-8cff-69341ae0ae33" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.346636 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd86c7f5-3661-4744-8cff-69341ae0ae33" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.346889 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd86c7f5-3661-4744-8cff-69341ae0ae33" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.347585 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.349504 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.349722 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.350779 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.351256 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.360039 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dvp5n"] Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.419317 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhswp\" (UniqueName: \"kubernetes.io/projected/3452cdf3-d990-497f-bd27-3572c9a08d85-kube-api-access-rhswp\") pod \"ssh-known-hosts-edpm-deployment-dvp5n\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.419394 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dvp5n\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.419798 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dvp5n\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.521576 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dvp5n\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.521659 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhswp\" (UniqueName: \"kubernetes.io/projected/3452cdf3-d990-497f-bd27-3572c9a08d85-kube-api-access-rhswp\") pod \"ssh-known-hosts-edpm-deployment-dvp5n\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.521691 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dvp5n\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.528148 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dvp5n\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: 
I0201 07:10:39.528186 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dvp5n\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.539041 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhswp\" (UniqueName: \"kubernetes.io/projected/3452cdf3-d990-497f-bd27-3572c9a08d85-kube-api-access-rhswp\") pod \"ssh-known-hosts-edpm-deployment-dvp5n\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:39 crc kubenswrapper[4546]: I0201 07:10:39.683304 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:40 crc kubenswrapper[4546]: I0201 07:10:40.381568 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dvp5n"] Feb 01 07:10:41 crc kubenswrapper[4546]: I0201 07:10:41.276424 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" event={"ID":"3452cdf3-d990-497f-bd27-3572c9a08d85","Type":"ContainerStarted","Data":"2b21233e7c4723aa4df9497736865fe38cfc0c922c257e8ab5e8a60801efca30"} Feb 01 07:10:41 crc kubenswrapper[4546]: I0201 07:10:41.276531 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" event={"ID":"3452cdf3-d990-497f-bd27-3572c9a08d85","Type":"ContainerStarted","Data":"099e6fe152f8ef038669d4e736daad24858cc2aeda9c9c09e5c6f70da19ef1f5"} Feb 01 07:10:41 crc kubenswrapper[4546]: I0201 07:10:41.303427 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" 
podStartSLOduration=1.8129818659999999 podStartE2EDuration="2.30340784s" podCreationTimestamp="2026-02-01 07:10:39 +0000 UTC" firstStartedPulling="2026-02-01 07:10:40.402914734 +0000 UTC m=+1671.053850751" lastFinishedPulling="2026-02-01 07:10:40.893340709 +0000 UTC m=+1671.544276725" observedRunningTime="2026-02-01 07:10:41.296031713 +0000 UTC m=+1671.946967718" watchObservedRunningTime="2026-02-01 07:10:41.30340784 +0000 UTC m=+1671.954343856" Feb 01 07:10:44 crc kubenswrapper[4546]: I0201 07:10:44.043933 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-qldlq"] Feb 01 07:10:44 crc kubenswrapper[4546]: I0201 07:10:44.053514 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-qldlq"] Feb 01 07:10:44 crc kubenswrapper[4546]: I0201 07:10:44.654784 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:10:44 crc kubenswrapper[4546]: E0201 07:10:44.655261 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:10:45 crc kubenswrapper[4546]: I0201 07:10:45.668707 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11be2508-ee43-420e-83f9-bb37921807d8" path="/var/lib/kubelet/pods/11be2508-ee43-420e-83f9-bb37921807d8/volumes" Feb 01 07:10:47 crc kubenswrapper[4546]: I0201 07:10:47.335640 4546 generic.go:334] "Generic (PLEG): container finished" podID="3452cdf3-d990-497f-bd27-3572c9a08d85" containerID="2b21233e7c4723aa4df9497736865fe38cfc0c922c257e8ab5e8a60801efca30" exitCode=0 Feb 01 07:10:47 crc kubenswrapper[4546]: 
I0201 07:10:47.335720 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" event={"ID":"3452cdf3-d990-497f-bd27-3572c9a08d85","Type":"ContainerDied","Data":"2b21233e7c4723aa4df9497736865fe38cfc0c922c257e8ab5e8a60801efca30"} Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.696949 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.853109 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhswp\" (UniqueName: \"kubernetes.io/projected/3452cdf3-d990-497f-bd27-3572c9a08d85-kube-api-access-rhswp\") pod \"3452cdf3-d990-497f-bd27-3572c9a08d85\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.853534 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-inventory-0\") pod \"3452cdf3-d990-497f-bd27-3572c9a08d85\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.853574 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-ssh-key-openstack-edpm-ipam\") pod \"3452cdf3-d990-497f-bd27-3572c9a08d85\" (UID: \"3452cdf3-d990-497f-bd27-3572c9a08d85\") " Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.865548 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3452cdf3-d990-497f-bd27-3572c9a08d85-kube-api-access-rhswp" (OuterVolumeSpecName: "kube-api-access-rhswp") pod "3452cdf3-d990-497f-bd27-3572c9a08d85" (UID: "3452cdf3-d990-497f-bd27-3572c9a08d85"). InnerVolumeSpecName "kube-api-access-rhswp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.883591 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3452cdf3-d990-497f-bd27-3572c9a08d85" (UID: "3452cdf3-d990-497f-bd27-3572c9a08d85"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.904257 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3452cdf3-d990-497f-bd27-3572c9a08d85" (UID: "3452cdf3-d990-497f-bd27-3572c9a08d85"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.956725 4546 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.956986 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3452cdf3-d990-497f-bd27-3572c9a08d85-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:48 crc kubenswrapper[4546]: I0201 07:10:48.957003 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhswp\" (UniqueName: \"kubernetes.io/projected/3452cdf3-d990-497f-bd27-3572c9a08d85-kube-api-access-rhswp\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.365099 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" 
event={"ID":"3452cdf3-d990-497f-bd27-3572c9a08d85","Type":"ContainerDied","Data":"099e6fe152f8ef038669d4e736daad24858cc2aeda9c9c09e5c6f70da19ef1f5"} Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.365160 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="099e6fe152f8ef038669d4e736daad24858cc2aeda9c9c09e5c6f70da19ef1f5" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.365263 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dvp5n" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.425708 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp"] Feb 01 07:10:49 crc kubenswrapper[4546]: E0201 07:10:49.426414 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3452cdf3-d990-497f-bd27-3572c9a08d85" containerName="ssh-known-hosts-edpm-deployment" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.426438 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3452cdf3-d990-497f-bd27-3572c9a08d85" containerName="ssh-known-hosts-edpm-deployment" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.426685 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3452cdf3-d990-497f-bd27-3572c9a08d85" containerName="ssh-known-hosts-edpm-deployment" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.427952 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.430707 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.431045 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.431220 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.432404 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.443185 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp"] Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.465659 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fnstp\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.466079 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fnstp\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.466347 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ght6d\" (UniqueName: \"kubernetes.io/projected/10bbfd2b-7d92-4730-a3e8-2a261b64b477-kube-api-access-ght6d\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fnstp\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.568992 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fnstp\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.569228 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ght6d\" (UniqueName: \"kubernetes.io/projected/10bbfd2b-7d92-4730-a3e8-2a261b64b477-kube-api-access-ght6d\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fnstp\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.569382 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fnstp\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.574284 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-fnstp\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.574994 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fnstp\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.586237 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ght6d\" (UniqueName: \"kubernetes.io/projected/10bbfd2b-7d92-4730-a3e8-2a261b64b477-kube-api-access-ght6d\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fnstp\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.746642 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:10:49 crc kubenswrapper[4546]: I0201 07:10:49.755378 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:50 crc kubenswrapper[4546]: I0201 07:10:50.287820 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:10:50 crc kubenswrapper[4546]: I0201 07:10:50.289383 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp"] Feb 01 07:10:50 crc kubenswrapper[4546]: I0201 07:10:50.377420 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" event={"ID":"10bbfd2b-7d92-4730-a3e8-2a261b64b477","Type":"ContainerStarted","Data":"799daa635b4ee01d5e49acbe5892975911db013cd13594b02a870e8837b88d3e"} Feb 01 07:10:50 crc kubenswrapper[4546]: I0201 07:10:50.801269 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:10:51 crc kubenswrapper[4546]: I0201 07:10:51.386736 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" event={"ID":"10bbfd2b-7d92-4730-a3e8-2a261b64b477","Type":"ContainerStarted","Data":"9e9e7052263bccce719a42cc8032ca51659ad05a0c8a5a35ec81979d7fa9081e"} Feb 01 07:10:51 crc kubenswrapper[4546]: I0201 07:10:51.411834 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" podStartSLOduration=1.901031527 podStartE2EDuration="2.411820106s" podCreationTimestamp="2026-02-01 07:10:49 +0000 UTC" firstStartedPulling="2026-02-01 07:10:50.285259185 +0000 UTC m=+1680.936195200" lastFinishedPulling="2026-02-01 07:10:50.796047763 +0000 UTC m=+1681.446983779" observedRunningTime="2026-02-01 07:10:51.401583835 +0000 UTC m=+1682.052519841" watchObservedRunningTime="2026-02-01 07:10:51.411820106 +0000 UTC m=+1682.062756122" Feb 01 07:10:56 crc kubenswrapper[4546]: I0201 07:10:56.612594 4546 
scope.go:117] "RemoveContainer" containerID="704b1cf06fa9bd035f9f48c831d5894e0f89d194a217f6bffff62f09614f62ce" Feb 01 07:10:56 crc kubenswrapper[4546]: I0201 07:10:56.647367 4546 scope.go:117] "RemoveContainer" containerID="a5558d53565683dee85253ea038f83ba31a8b2ca401b23441f4b82fe20f040db" Feb 01 07:10:56 crc kubenswrapper[4546]: I0201 07:10:56.695122 4546 scope.go:117] "RemoveContainer" containerID="dbe98116b2536d914d6e7edcd3966feaffeb60826ac1a7318f5adcbb87b511b9" Feb 01 07:10:57 crc kubenswrapper[4546]: I0201 07:10:57.444379 4546 generic.go:334] "Generic (PLEG): container finished" podID="10bbfd2b-7d92-4730-a3e8-2a261b64b477" containerID="9e9e7052263bccce719a42cc8032ca51659ad05a0c8a5a35ec81979d7fa9081e" exitCode=0 Feb 01 07:10:57 crc kubenswrapper[4546]: I0201 07:10:57.444452 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" event={"ID":"10bbfd2b-7d92-4730-a3e8-2a261b64b477","Type":"ContainerDied","Data":"9e9e7052263bccce719a42cc8032ca51659ad05a0c8a5a35ec81979d7fa9081e"} Feb 01 07:10:57 crc kubenswrapper[4546]: I0201 07:10:57.655389 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:10:57 crc kubenswrapper[4546]: E0201 07:10:57.655779 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:10:58 crc kubenswrapper[4546]: I0201 07:10:58.830310 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:58 crc kubenswrapper[4546]: I0201 07:10:58.999009 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-ssh-key-openstack-edpm-ipam\") pod \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " Feb 01 07:10:58 crc kubenswrapper[4546]: I0201 07:10:58.999450 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-inventory\") pod \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " Feb 01 07:10:58 crc kubenswrapper[4546]: I0201 07:10:58.999956 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ght6d\" (UniqueName: \"kubernetes.io/projected/10bbfd2b-7d92-4730-a3e8-2a261b64b477-kube-api-access-ght6d\") pod \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.006801 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10bbfd2b-7d92-4730-a3e8-2a261b64b477-kube-api-access-ght6d" (OuterVolumeSpecName: "kube-api-access-ght6d") pod "10bbfd2b-7d92-4730-a3e8-2a261b64b477" (UID: "10bbfd2b-7d92-4730-a3e8-2a261b64b477"). InnerVolumeSpecName "kube-api-access-ght6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:59 crc kubenswrapper[4546]: E0201 07:10:59.027289 4546 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-inventory podName:10bbfd2b-7d92-4730-a3e8-2a261b64b477 nodeName:}" failed. 
No retries permitted until 2026-02-01 07:10:59.525309638 +0000 UTC m=+1690.176245654 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-inventory") pod "10bbfd2b-7d92-4730-a3e8-2a261b64b477" (UID: "10bbfd2b-7d92-4730-a3e8-2a261b64b477") : error deleting /var/lib/kubelet/pods/10bbfd2b-7d92-4730-a3e8-2a261b64b477/volume-subpaths: remove /var/lib/kubelet/pods/10bbfd2b-7d92-4730-a3e8-2a261b64b477/volume-subpaths: no such file or directory Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.028039 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "10bbfd2b-7d92-4730-a3e8-2a261b64b477" (UID: "10bbfd2b-7d92-4730-a3e8-2a261b64b477"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.105367 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ght6d\" (UniqueName: \"kubernetes.io/projected/10bbfd2b-7d92-4730-a3e8-2a261b64b477-kube-api-access-ght6d\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.105408 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.466049 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" event={"ID":"10bbfd2b-7d92-4730-a3e8-2a261b64b477","Type":"ContainerDied","Data":"799daa635b4ee01d5e49acbe5892975911db013cd13594b02a870e8837b88d3e"} Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.466384 4546 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="799daa635b4ee01d5e49acbe5892975911db013cd13594b02a870e8837b88d3e" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.466097 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fnstp" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.567060 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd"] Feb 01 07:10:59 crc kubenswrapper[4546]: E0201 07:10:59.568337 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bbfd2b-7d92-4730-a3e8-2a261b64b477" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.568367 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bbfd2b-7d92-4730-a3e8-2a261b64b477" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.569114 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="10bbfd2b-7d92-4730-a3e8-2a261b64b477" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.595699 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd"] Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.595893 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.617178 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-inventory\") pod \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\" (UID: \"10bbfd2b-7d92-4730-a3e8-2a261b64b477\") " Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.623637 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-inventory" (OuterVolumeSpecName: "inventory") pod "10bbfd2b-7d92-4730-a3e8-2a261b64b477" (UID: "10bbfd2b-7d92-4730-a3e8-2a261b64b477"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.720730 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.721271 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhqxj\" (UniqueName: \"kubernetes.io/projected/62c1cd63-26b0-44f6-b45f-229a13541859-kube-api-access-vhqxj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.721456 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.721985 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bbfd2b-7d92-4730-a3e8-2a261b64b477-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.824846 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.825005 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhqxj\" (UniqueName: \"kubernetes.io/projected/62c1cd63-26b0-44f6-b45f-229a13541859-kube-api-access-vhqxj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.825079 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.834577 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.843674 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.863389 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhqxj\" (UniqueName: \"kubernetes.io/projected/62c1cd63-26b0-44f6-b45f-229a13541859-kube-api-access-vhqxj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:10:59 crc kubenswrapper[4546]: I0201 07:10:59.915269 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:11:00 crc kubenswrapper[4546]: I0201 07:11:00.483570 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd"] Feb 01 07:11:01 crc kubenswrapper[4546]: I0201 07:11:01.491367 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" event={"ID":"62c1cd63-26b0-44f6-b45f-229a13541859","Type":"ContainerStarted","Data":"ab4a0a890e3ddd28c2f1299b69eb9caeb3d00d2eb30d5a71a6d131fe321ba622"} Feb 01 07:11:01 crc kubenswrapper[4546]: I0201 07:11:01.491949 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" event={"ID":"62c1cd63-26b0-44f6-b45f-229a13541859","Type":"ContainerStarted","Data":"f2bc29cf1d3a1e70a7bbd76b23c565d5b3af259a2e9c3360c112698395a1162b"} Feb 01 07:11:01 crc kubenswrapper[4546]: I0201 07:11:01.510076 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" podStartSLOduration=1.899012655 podStartE2EDuration="2.510057113s" podCreationTimestamp="2026-02-01 07:10:59 +0000 UTC" firstStartedPulling="2026-02-01 07:11:00.490945826 +0000 UTC m=+1691.141881843" lastFinishedPulling="2026-02-01 07:11:01.101990285 +0000 UTC m=+1691.752926301" observedRunningTime="2026-02-01 07:11:01.506962919 +0000 UTC m=+1692.157898936" watchObservedRunningTime="2026-02-01 07:11:01.510057113 +0000 UTC m=+1692.160993130" Feb 01 07:11:08 crc kubenswrapper[4546]: I0201 07:11:08.564569 4546 generic.go:334] "Generic (PLEG): container finished" podID="62c1cd63-26b0-44f6-b45f-229a13541859" containerID="ab4a0a890e3ddd28c2f1299b69eb9caeb3d00d2eb30d5a71a6d131fe321ba622" exitCode=0 Feb 01 07:11:08 crc kubenswrapper[4546]: I0201 07:11:08.564668 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" event={"ID":"62c1cd63-26b0-44f6-b45f-229a13541859","Type":"ContainerDied","Data":"ab4a0a890e3ddd28c2f1299b69eb9caeb3d00d2eb30d5a71a6d131fe321ba622"} Feb 01 07:11:09 crc kubenswrapper[4546]: I0201 07:11:09.662137 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:11:09 crc kubenswrapper[4546]: E0201 07:11:09.662776 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:11:09 crc kubenswrapper[4546]: I0201 07:11:09.975942 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.088252 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-ssh-key-openstack-edpm-ipam\") pod \"62c1cd63-26b0-44f6-b45f-229a13541859\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.088396 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-inventory\") pod \"62c1cd63-26b0-44f6-b45f-229a13541859\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.088532 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhqxj\" (UniqueName: \"kubernetes.io/projected/62c1cd63-26b0-44f6-b45f-229a13541859-kube-api-access-vhqxj\") pod \"62c1cd63-26b0-44f6-b45f-229a13541859\" (UID: \"62c1cd63-26b0-44f6-b45f-229a13541859\") " Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.095460 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c1cd63-26b0-44f6-b45f-229a13541859-kube-api-access-vhqxj" (OuterVolumeSpecName: "kube-api-access-vhqxj") pod "62c1cd63-26b0-44f6-b45f-229a13541859" (UID: "62c1cd63-26b0-44f6-b45f-229a13541859"). InnerVolumeSpecName "kube-api-access-vhqxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.123938 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-inventory" (OuterVolumeSpecName: "inventory") pod "62c1cd63-26b0-44f6-b45f-229a13541859" (UID: "62c1cd63-26b0-44f6-b45f-229a13541859"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.142622 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "62c1cd63-26b0-44f6-b45f-229a13541859" (UID: "62c1cd63-26b0-44f6-b45f-229a13541859"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.192665 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhqxj\" (UniqueName: \"kubernetes.io/projected/62c1cd63-26b0-44f6-b45f-229a13541859-kube-api-access-vhqxj\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.192975 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.192994 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62c1cd63-26b0-44f6-b45f-229a13541859-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.589178 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" event={"ID":"62c1cd63-26b0-44f6-b45f-229a13541859","Type":"ContainerDied","Data":"f2bc29cf1d3a1e70a7bbd76b23c565d5b3af259a2e9c3360c112698395a1162b"} Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.589246 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2bc29cf1d3a1e70a7bbd76b23c565d5b3af259a2e9c3360c112698395a1162b" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 
07:11:10.589292 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4wknd" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.689649 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975"] Feb 01 07:11:10 crc kubenswrapper[4546]: E0201 07:11:10.690181 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c1cd63-26b0-44f6-b45f-229a13541859" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.690198 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c1cd63-26b0-44f6-b45f-229a13541859" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.690441 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c1cd63-26b0-44f6-b45f-229a13541859" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.691283 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.695285 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.695957 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.698908 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.699053 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.699412 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.699481 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.699617 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.699800 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705016 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705076 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705130 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705205 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705232 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: 
\"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705252 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705319 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705413 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705478 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" 
(UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705514 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705576 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705635 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705673 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rxg7\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-kube-api-access-5rxg7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.705774 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.707843 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975"] Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.809338 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.809957 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.810561 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.811360 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.811527 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.811748 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.813478 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: 
\"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.813642 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.813832 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.814081 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.814223 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: 
\"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.814345 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.814449 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.814541 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rxg7\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-kube-api-access-5rxg7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.816658 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.817128 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.817662 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.819451 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.819939 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.820029 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.820041 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.821391 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.822316 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.823085 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.824245 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.824348 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.824943 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:10 crc kubenswrapper[4546]: I0201 07:11:10.843567 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rxg7\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-kube-api-access-5rxg7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tb975\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:11 crc kubenswrapper[4546]: I0201 07:11:11.013268 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:11 crc kubenswrapper[4546]: I0201 07:11:11.512353 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975"] Feb 01 07:11:11 crc kubenswrapper[4546]: I0201 07:11:11.600611 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" event={"ID":"6ceaf8ad-98f4-4371-b2c3-6764be02b013","Type":"ContainerStarted","Data":"97bca73f64d140da7a95c53e6da86a13e9a4c5e7ef8c6aa7741c824012ab9da7"} Feb 01 07:11:12 crc kubenswrapper[4546]: I0201 07:11:12.612588 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" event={"ID":"6ceaf8ad-98f4-4371-b2c3-6764be02b013","Type":"ContainerStarted","Data":"b7c50a8584f54ebc62620c3f738363af8f5b11dd2fab09b9d1243841f8a9971f"} Feb 01 07:11:12 crc kubenswrapper[4546]: I0201 07:11:12.644508 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" podStartSLOduration=2.135864396 podStartE2EDuration="2.644483533s" podCreationTimestamp="2026-02-01 07:11:10 +0000 UTC" firstStartedPulling="2026-02-01 07:11:11.518957974 +0000 UTC m=+1702.169893990" lastFinishedPulling="2026-02-01 07:11:12.027577112 +0000 UTC m=+1702.678513127" observedRunningTime="2026-02-01 07:11:12.637478946 +0000 UTC m=+1703.288414962" watchObservedRunningTime="2026-02-01 07:11:12.644483533 +0000 UTC m=+1703.295419549" Feb 01 07:11:23 crc kubenswrapper[4546]: I0201 07:11:23.655816 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 
07:11:23 crc kubenswrapper[4546]: E0201 07:11:23.658400 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:11:34 crc kubenswrapper[4546]: I0201 07:11:34.655525 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:11:34 crc kubenswrapper[4546]: E0201 07:11:34.656555 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:11:39 crc kubenswrapper[4546]: I0201 07:11:39.916487 4546 generic.go:334] "Generic (PLEG): container finished" podID="6ceaf8ad-98f4-4371-b2c3-6764be02b013" containerID="b7c50a8584f54ebc62620c3f738363af8f5b11dd2fab09b9d1243841f8a9971f" exitCode=0 Feb 01 07:11:39 crc kubenswrapper[4546]: I0201 07:11:39.916569 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" event={"ID":"6ceaf8ad-98f4-4371-b2c3-6764be02b013","Type":"ContainerDied","Data":"b7c50a8584f54ebc62620c3f738363af8f5b11dd2fab09b9d1243841f8a9971f"} Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.296209 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328484 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-inventory\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328539 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rxg7\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-kube-api-access-5rxg7\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328566 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-ovn-default-certs-0\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328602 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-telemetry-combined-ca-bundle\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328646 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: 
\"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328684 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-repo-setup-combined-ca-bundle\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328782 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ssh-key-openstack-edpm-ipam\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328815 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328838 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-nova-combined-ca-bundle\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.328901 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-libvirt-combined-ca-bundle\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc 
kubenswrapper[4546]: I0201 07:11:41.328983 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ovn-combined-ca-bundle\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.338355 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.338530 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-kube-api-access-5rxg7" (OuterVolumeSpecName: "kube-api-access-5rxg7") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "kube-api-access-5rxg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.342111 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.342186 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.342296 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.343145 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.344534 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). 
InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.349067 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.349717 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.362238 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-inventory" (OuterVolumeSpecName: "inventory") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.364945 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.433208 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-bootstrap-combined-ca-bundle\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.433379 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-neutron-metadata-combined-ca-bundle\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.433476 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\" (UID: \"6ceaf8ad-98f4-4371-b2c3-6764be02b013\") " Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434828 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434851 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rxg7\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-kube-api-access-5rxg7\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434878 4546 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434889 4546 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434902 4546 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434914 4546 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434926 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434945 4546 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434953 4546 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434962 
4546 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.434974 4546 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.436176 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.436628 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.437920 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "6ceaf8ad-98f4-4371-b2c3-6764be02b013" (UID: "6ceaf8ad-98f4-4371-b2c3-6764be02b013"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.538223 4546 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.538267 4546 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ceaf8ad-98f4-4371-b2c3-6764be02b013-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.538283 4546 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6ceaf8ad-98f4-4371-b2c3-6764be02b013-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.940126 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" event={"ID":"6ceaf8ad-98f4-4371-b2c3-6764be02b013","Type":"ContainerDied","Data":"97bca73f64d140da7a95c53e6da86a13e9a4c5e7ef8c6aa7741c824012ab9da7"} Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.940469 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97bca73f64d140da7a95c53e6da86a13e9a4c5e7ef8c6aa7741c824012ab9da7" Feb 01 07:11:41 crc kubenswrapper[4546]: I0201 07:11:41.940202 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tb975" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.028656 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh"] Feb 01 07:11:42 crc kubenswrapper[4546]: E0201 07:11:42.029147 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ceaf8ad-98f4-4371-b2c3-6764be02b013" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.029167 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ceaf8ad-98f4-4371-b2c3-6764be02b013" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.029387 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ceaf8ad-98f4-4371-b2c3-6764be02b013" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.030091 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.035266 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.035404 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.035447 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.035558 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.035593 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.039134 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh"] Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.053385 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.053482 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.053633 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gf8x\" (UniqueName: \"kubernetes.io/projected/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-kube-api-access-9gf8x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.053746 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.053798 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.156142 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.156225 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.156345 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gf8x\" (UniqueName: \"kubernetes.io/projected/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-kube-api-access-9gf8x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.156444 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.156490 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.157590 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: 
\"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.160154 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.160693 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.161142 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.173803 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gf8x\" (UniqueName: \"kubernetes.io/projected/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-kube-api-access-9gf8x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-vcmhh\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.366983 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.881568 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh"] Feb 01 07:11:42 crc kubenswrapper[4546]: W0201 07:11:42.883145 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b0618fd_e308_4b6a_ba9b_ec5657196bb8.slice/crio-e9ef88c1f26b62745f6c4061d9d0248da24069ab0febdc60ab073e8255319ba2 WatchSource:0}: Error finding container e9ef88c1f26b62745f6c4061d9d0248da24069ab0febdc60ab073e8255319ba2: Status 404 returned error can't find the container with id e9ef88c1f26b62745f6c4061d9d0248da24069ab0febdc60ab073e8255319ba2 Feb 01 07:11:42 crc kubenswrapper[4546]: I0201 07:11:42.949179 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" event={"ID":"0b0618fd-e308-4b6a-ba9b-ec5657196bb8","Type":"ContainerStarted","Data":"e9ef88c1f26b62745f6c4061d9d0248da24069ab0febdc60ab073e8255319ba2"} Feb 01 07:11:43 crc kubenswrapper[4546]: I0201 07:11:43.959468 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" event={"ID":"0b0618fd-e308-4b6a-ba9b-ec5657196bb8","Type":"ContainerStarted","Data":"6a3e2dd72ad652695fb18de5663f0c6ce1b4f677bacb34a8444035999e15ce77"} Feb 01 07:11:43 crc kubenswrapper[4546]: I0201 07:11:43.978665 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" podStartSLOduration=1.472120094 podStartE2EDuration="1.978646235s" podCreationTimestamp="2026-02-01 07:11:42 +0000 UTC" firstStartedPulling="2026-02-01 07:11:42.885298407 +0000 UTC m=+1733.536234413" lastFinishedPulling="2026-02-01 07:11:43.391824539 +0000 UTC m=+1734.042760554" observedRunningTime="2026-02-01 
07:11:43.973889728 +0000 UTC m=+1734.624825744" watchObservedRunningTime="2026-02-01 07:11:43.978646235 +0000 UTC m=+1734.629582252" Feb 01 07:11:48 crc kubenswrapper[4546]: I0201 07:11:48.655096 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:11:48 crc kubenswrapper[4546]: E0201 07:11:48.656219 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:12:03 crc kubenswrapper[4546]: I0201 07:12:03.656112 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:12:03 crc kubenswrapper[4546]: E0201 07:12:03.657732 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:12:17 crc kubenswrapper[4546]: I0201 07:12:17.656178 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:12:17 crc kubenswrapper[4546]: E0201 07:12:17.657219 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:12:32 crc kubenswrapper[4546]: I0201 07:12:32.657920 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:12:32 crc kubenswrapper[4546]: E0201 07:12:32.660033 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:12:33 crc kubenswrapper[4546]: I0201 07:12:33.464288 4546 generic.go:334] "Generic (PLEG): container finished" podID="0b0618fd-e308-4b6a-ba9b-ec5657196bb8" containerID="6a3e2dd72ad652695fb18de5663f0c6ce1b4f677bacb34a8444035999e15ce77" exitCode=0 Feb 01 07:12:33 crc kubenswrapper[4546]: I0201 07:12:33.464338 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" event={"ID":"0b0618fd-e308-4b6a-ba9b-ec5657196bb8","Type":"ContainerDied","Data":"6a3e2dd72ad652695fb18de5663f0c6ce1b4f677bacb34a8444035999e15ce77"} Feb 01 07:12:34 crc kubenswrapper[4546]: I0201 07:12:34.840247 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.010928 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovn-combined-ca-bundle\") pod \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.010982 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gf8x\" (UniqueName: \"kubernetes.io/projected/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-kube-api-access-9gf8x\") pod \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.011037 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ssh-key-openstack-edpm-ipam\") pod \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.011074 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovncontroller-config-0\") pod \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.011134 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-inventory\") pod \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\" (UID: \"0b0618fd-e308-4b6a-ba9b-ec5657196bb8\") " Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.031248 4546 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-kube-api-access-9gf8x" (OuterVolumeSpecName: "kube-api-access-9gf8x") pod "0b0618fd-e308-4b6a-ba9b-ec5657196bb8" (UID: "0b0618fd-e308-4b6a-ba9b-ec5657196bb8"). InnerVolumeSpecName "kube-api-access-9gf8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.032720 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0b0618fd-e308-4b6a-ba9b-ec5657196bb8" (UID: "0b0618fd-e308-4b6a-ba9b-ec5657196bb8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.051816 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-inventory" (OuterVolumeSpecName: "inventory") pod "0b0618fd-e308-4b6a-ba9b-ec5657196bb8" (UID: "0b0618fd-e308-4b6a-ba9b-ec5657196bb8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.056572 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "0b0618fd-e308-4b6a-ba9b-ec5657196bb8" (UID: "0b0618fd-e308-4b6a-ba9b-ec5657196bb8"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.061008 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b0618fd-e308-4b6a-ba9b-ec5657196bb8" (UID: "0b0618fd-e308-4b6a-ba9b-ec5657196bb8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.114636 4546 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.114663 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gf8x\" (UniqueName: \"kubernetes.io/projected/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-kube-api-access-9gf8x\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.114677 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.114686 4546 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.114697 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b0618fd-e308-4b6a-ba9b-ec5657196bb8-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.484452 4546 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" event={"ID":"0b0618fd-e308-4b6a-ba9b-ec5657196bb8","Type":"ContainerDied","Data":"e9ef88c1f26b62745f6c4061d9d0248da24069ab0febdc60ab073e8255319ba2"} Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.484491 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9ef88c1f26b62745f6c4061d9d0248da24069ab0febdc60ab073e8255319ba2" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.484527 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-vcmhh" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.583087 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd"] Feb 01 07:12:35 crc kubenswrapper[4546]: E0201 07:12:35.583497 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b0618fd-e308-4b6a-ba9b-ec5657196bb8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.583516 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b0618fd-e308-4b6a-ba9b-ec5657196bb8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.583763 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b0618fd-e308-4b6a-ba9b-ec5657196bb8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.584398 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.589275 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.589666 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.590354 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.590736 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.591123 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.591138 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.592426 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd"] Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.627490 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86ld9\" (UniqueName: \"kubernetes.io/projected/548edaca-19fb-4613-972e-430393166485-kube-api-access-86ld9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.627634 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.627676 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.627716 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.627882 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.628033 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.730385 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.730429 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.730474 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.730547 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.730593 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.730821 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86ld9\" (UniqueName: \"kubernetes.io/projected/548edaca-19fb-4613-972e-430393166485-kube-api-access-86ld9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.736545 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.736897 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.739367 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.739615 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.741361 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.756829 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86ld9\" (UniqueName: \"kubernetes.io/projected/548edaca-19fb-4613-972e-430393166485-kube-api-access-86ld9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:35 crc kubenswrapper[4546]: I0201 07:12:35.899191 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:12:36 crc kubenswrapper[4546]: I0201 07:12:36.225515 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd"] Feb 01 07:12:36 crc kubenswrapper[4546]: I0201 07:12:36.495229 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" event={"ID":"548edaca-19fb-4613-972e-430393166485","Type":"ContainerStarted","Data":"aeda06b86e6d1c27502252617c32a836675c76255f598e3b8ad9087007146ebf"} Feb 01 07:12:37 crc kubenswrapper[4546]: I0201 07:12:37.507006 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" event={"ID":"548edaca-19fb-4613-972e-430393166485","Type":"ContainerStarted","Data":"e672ce48a153609d09ed2e40c444e78ae350c4fb8cacbdb4e9926bc9a74d86bf"} Feb 01 07:12:37 crc kubenswrapper[4546]: I0201 07:12:37.525520 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" podStartSLOduration=2.036736416 podStartE2EDuration="2.525489727s" podCreationTimestamp="2026-02-01 07:12:35 +0000 UTC" firstStartedPulling="2026-02-01 07:12:36.222239834 +0000 UTC m=+1786.873175850" lastFinishedPulling="2026-02-01 07:12:36.710993145 +0000 UTC m=+1787.361929161" observedRunningTime="2026-02-01 07:12:37.520111787 +0000 UTC m=+1788.171047803" watchObservedRunningTime="2026-02-01 07:12:37.525489727 +0000 UTC m=+1788.176425744" Feb 01 07:12:46 crc kubenswrapper[4546]: I0201 07:12:46.655207 4546 scope.go:117] "RemoveContainer" 
containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:12:46 crc kubenswrapper[4546]: E0201 07:12:46.656337 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:13:01 crc kubenswrapper[4546]: I0201 07:13:01.655418 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:13:02 crc kubenswrapper[4546]: I0201 07:13:02.764909 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"8f9e44d0438a4e3efd7cd015e8a7547952b907067a67777f4cd8c8ea39720319"} Feb 01 07:13:14 crc kubenswrapper[4546]: I0201 07:13:14.882687 4546 generic.go:334] "Generic (PLEG): container finished" podID="548edaca-19fb-4613-972e-430393166485" containerID="e672ce48a153609d09ed2e40c444e78ae350c4fb8cacbdb4e9926bc9a74d86bf" exitCode=0 Feb 01 07:13:14 crc kubenswrapper[4546]: I0201 07:13:14.883314 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" event={"ID":"548edaca-19fb-4613-972e-430393166485","Type":"ContainerDied","Data":"e672ce48a153609d09ed2e40c444e78ae350c4fb8cacbdb4e9926bc9a74d86bf"} Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.300888 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.411940 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-inventory\") pod \"548edaca-19fb-4613-972e-430393166485\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.412137 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86ld9\" (UniqueName: \"kubernetes.io/projected/548edaca-19fb-4613-972e-430393166485-kube-api-access-86ld9\") pod \"548edaca-19fb-4613-972e-430393166485\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.412169 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-metadata-combined-ca-bundle\") pod \"548edaca-19fb-4613-972e-430393166485\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.412279 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-ssh-key-openstack-edpm-ipam\") pod \"548edaca-19fb-4613-972e-430393166485\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.412913 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-nova-metadata-neutron-config-0\") pod \"548edaca-19fb-4613-972e-430393166485\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " Feb 01 07:13:16 crc 
kubenswrapper[4546]: I0201 07:13:16.413214 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-ovn-metadata-agent-neutron-config-0\") pod \"548edaca-19fb-4613-972e-430393166485\" (UID: \"548edaca-19fb-4613-972e-430393166485\") " Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.433297 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "548edaca-19fb-4613-972e-430393166485" (UID: "548edaca-19fb-4613-972e-430393166485"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.441130 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/548edaca-19fb-4613-972e-430393166485-kube-api-access-86ld9" (OuterVolumeSpecName: "kube-api-access-86ld9") pod "548edaca-19fb-4613-972e-430393166485" (UID: "548edaca-19fb-4613-972e-430393166485"). InnerVolumeSpecName "kube-api-access-86ld9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.448486 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "548edaca-19fb-4613-972e-430393166485" (UID: "548edaca-19fb-4613-972e-430393166485"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.448972 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-inventory" (OuterVolumeSpecName: "inventory") pod "548edaca-19fb-4613-972e-430393166485" (UID: "548edaca-19fb-4613-972e-430393166485"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.450895 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "548edaca-19fb-4613-972e-430393166485" (UID: "548edaca-19fb-4613-972e-430393166485"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.451299 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "548edaca-19fb-4613-972e-430393166485" (UID: "548edaca-19fb-4613-972e-430393166485"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.517999 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.518033 4546 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.518045 4546 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.518057 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.518068 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86ld9\" (UniqueName: \"kubernetes.io/projected/548edaca-19fb-4613-972e-430393166485-kube-api-access-86ld9\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.518079 4546 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548edaca-19fb-4613-972e-430393166485-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.906961 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" event={"ID":"548edaca-19fb-4613-972e-430393166485","Type":"ContainerDied","Data":"aeda06b86e6d1c27502252617c32a836675c76255f598e3b8ad9087007146ebf"} Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.907013 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeda06b86e6d1c27502252617c32a836675c76255f598e3b8ad9087007146ebf" Feb 01 07:13:16 crc kubenswrapper[4546]: I0201 07:13:16.907014 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7xqgd" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.018710 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b"] Feb 01 07:13:17 crc kubenswrapper[4546]: E0201 07:13:17.019121 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548edaca-19fb-4613-972e-430393166485" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.019141 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="548edaca-19fb-4613-972e-430393166485" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.019328 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="548edaca-19fb-4613-972e-430393166485" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.019948 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.024410 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.024455 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.024611 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.024784 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.025260 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.036181 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b"] Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.130423 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.130589 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: 
\"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.130624 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbwfl\" (UniqueName: \"kubernetes.io/projected/e9657e9f-b182-4053-8b91-2769b0da218c-kube-api-access-mbwfl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.130674 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.130871 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.232234 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 
07:13:17.232299 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.232394 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.232418 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbwfl\" (UniqueName: \"kubernetes.io/projected/e9657e9f-b182-4053-8b91-2769b0da218c-kube-api-access-mbwfl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.232450 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.236443 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.236458 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.237259 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.246468 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.248114 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbwfl\" (UniqueName: \"kubernetes.io/projected/e9657e9f-b182-4053-8b91-2769b0da218c-kube-api-access-mbwfl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.338401 4546 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.835561 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b"] Feb 01 07:13:17 crc kubenswrapper[4546]: I0201 07:13:17.915870 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" event={"ID":"e9657e9f-b182-4053-8b91-2769b0da218c","Type":"ContainerStarted","Data":"2c7f27849b427dd81f40f31dae06185ddde8b503ecbb65692f7102e2f3d2e4ba"} Feb 01 07:13:18 crc kubenswrapper[4546]: I0201 07:13:18.927544 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" event={"ID":"e9657e9f-b182-4053-8b91-2769b0da218c","Type":"ContainerStarted","Data":"b2e724e66545a6f99b37e613a0c5003acc6568eee66988b5dd93afa59a5a1bd2"} Feb 01 07:13:18 crc kubenswrapper[4546]: I0201 07:13:18.948937 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" podStartSLOduration=2.436024279 podStartE2EDuration="2.948893782s" podCreationTimestamp="2026-02-01 07:13:16 +0000 UTC" firstStartedPulling="2026-02-01 07:13:17.839188852 +0000 UTC m=+1828.490124867" lastFinishedPulling="2026-02-01 07:13:18.352058354 +0000 UTC m=+1829.002994370" observedRunningTime="2026-02-01 07:13:18.947018726 +0000 UTC m=+1829.597954742" watchObservedRunningTime="2026-02-01 07:13:18.948893782 +0000 UTC m=+1829.599829798" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.148817 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk"] Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.150928 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.152953 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.156604 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.160567 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk"] Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.263234 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f86b3853-383a-4ffa-8256-b7c5ec09e580-secret-volume\") pod \"collect-profiles-29498835-56thk\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.263343 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svqdw\" (UniqueName: \"kubernetes.io/projected/f86b3853-383a-4ffa-8256-b7c5ec09e580-kube-api-access-svqdw\") pod \"collect-profiles-29498835-56thk\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.263523 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f86b3853-383a-4ffa-8256-b7c5ec09e580-config-volume\") pod \"collect-profiles-29498835-56thk\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.367441 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f86b3853-383a-4ffa-8256-b7c5ec09e580-secret-volume\") pod \"collect-profiles-29498835-56thk\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.367503 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svqdw\" (UniqueName: \"kubernetes.io/projected/f86b3853-383a-4ffa-8256-b7c5ec09e580-kube-api-access-svqdw\") pod \"collect-profiles-29498835-56thk\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.367598 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f86b3853-383a-4ffa-8256-b7c5ec09e580-config-volume\") pod \"collect-profiles-29498835-56thk\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.368632 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f86b3853-383a-4ffa-8256-b7c5ec09e580-config-volume\") pod \"collect-profiles-29498835-56thk\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.374834 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f86b3853-383a-4ffa-8256-b7c5ec09e580-secret-volume\") pod \"collect-profiles-29498835-56thk\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.387787 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svqdw\" (UniqueName: \"kubernetes.io/projected/f86b3853-383a-4ffa-8256-b7c5ec09e580-kube-api-access-svqdw\") pod \"collect-profiles-29498835-56thk\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.475612 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:00 crc kubenswrapper[4546]: I0201 07:15:00.910945 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk"] Feb 01 07:15:01 crc kubenswrapper[4546]: I0201 07:15:01.925752 4546 generic.go:334] "Generic (PLEG): container finished" podID="f86b3853-383a-4ffa-8256-b7c5ec09e580" containerID="30b71ac108a05944748368aa998a9b332d5d58bfdcb0c8f6536af2877f9477b5" exitCode=0 Feb 01 07:15:01 crc kubenswrapper[4546]: I0201 07:15:01.926125 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" event={"ID":"f86b3853-383a-4ffa-8256-b7c5ec09e580","Type":"ContainerDied","Data":"30b71ac108a05944748368aa998a9b332d5d58bfdcb0c8f6536af2877f9477b5"} Feb 01 07:15:01 crc kubenswrapper[4546]: I0201 07:15:01.926169 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" 
event={"ID":"f86b3853-383a-4ffa-8256-b7c5ec09e580","Type":"ContainerStarted","Data":"d64a78028e26508aba694bab617d4f6446bfe90c5863d2c9bbf417ebc08d3b34"} Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.210346 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.335487 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f86b3853-383a-4ffa-8256-b7c5ec09e580-config-volume\") pod \"f86b3853-383a-4ffa-8256-b7c5ec09e580\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.335631 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svqdw\" (UniqueName: \"kubernetes.io/projected/f86b3853-383a-4ffa-8256-b7c5ec09e580-kube-api-access-svqdw\") pod \"f86b3853-383a-4ffa-8256-b7c5ec09e580\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.335791 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f86b3853-383a-4ffa-8256-b7c5ec09e580-secret-volume\") pod \"f86b3853-383a-4ffa-8256-b7c5ec09e580\" (UID: \"f86b3853-383a-4ffa-8256-b7c5ec09e580\") " Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.336298 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86b3853-383a-4ffa-8256-b7c5ec09e580-config-volume" (OuterVolumeSpecName: "config-volume") pod "f86b3853-383a-4ffa-8256-b7c5ec09e580" (UID: "f86b3853-383a-4ffa-8256-b7c5ec09e580"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.337196 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f86b3853-383a-4ffa-8256-b7c5ec09e580-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.342783 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86b3853-383a-4ffa-8256-b7c5ec09e580-kube-api-access-svqdw" (OuterVolumeSpecName: "kube-api-access-svqdw") pod "f86b3853-383a-4ffa-8256-b7c5ec09e580" (UID: "f86b3853-383a-4ffa-8256-b7c5ec09e580"). InnerVolumeSpecName "kube-api-access-svqdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.352157 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86b3853-383a-4ffa-8256-b7c5ec09e580-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f86b3853-383a-4ffa-8256-b7c5ec09e580" (UID: "f86b3853-383a-4ffa-8256-b7c5ec09e580"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.439114 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svqdw\" (UniqueName: \"kubernetes.io/projected/f86b3853-383a-4ffa-8256-b7c5ec09e580-kube-api-access-svqdw\") on node \"crc\" DevicePath \"\"" Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.439263 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f86b3853-383a-4ffa-8256-b7c5ec09e580-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.946961 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" event={"ID":"f86b3853-383a-4ffa-8256-b7c5ec09e580","Type":"ContainerDied","Data":"d64a78028e26508aba694bab617d4f6446bfe90c5863d2c9bbf417ebc08d3b34"} Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.947254 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d64a78028e26508aba694bab617d4f6446bfe90c5863d2c9bbf417ebc08d3b34" Feb 01 07:15:03 crc kubenswrapper[4546]: I0201 07:15:03.947016 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk" Feb 01 07:15:04 crc kubenswrapper[4546]: I0201 07:15:04.291043 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm"] Feb 01 07:15:04 crc kubenswrapper[4546]: I0201 07:15:04.299391 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498790-bxbrm"] Feb 01 07:15:05 crc kubenswrapper[4546]: I0201 07:15:05.663996 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0904ae3e-72bf-4b72-9c6b-734d840b9cf5" path="/var/lib/kubelet/pods/0904ae3e-72bf-4b72-9c6b-734d840b9cf5/volumes" Feb 01 07:15:25 crc kubenswrapper[4546]: I0201 07:15:25.421500 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:15:25 crc kubenswrapper[4546]: I0201 07:15:25.422260 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:15:55 crc kubenswrapper[4546]: I0201 07:15:55.420718 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:15:55 crc kubenswrapper[4546]: I0201 07:15:55.422126 4546 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:15:56 crc kubenswrapper[4546]: I0201 07:15:56.889966 4546 scope.go:117] "RemoveContainer" containerID="30fd37b455f8af9eb03ce8fc4accd607a66296af0632af56c6f814f570eacd0f" Feb 01 07:16:25 crc kubenswrapper[4546]: I0201 07:16:25.420715 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:16:25 crc kubenswrapper[4546]: I0201 07:16:25.421263 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:16:25 crc kubenswrapper[4546]: I0201 07:16:25.421325 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:16:25 crc kubenswrapper[4546]: I0201 07:16:25.422266 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f9e44d0438a4e3efd7cd015e8a7547952b907067a67777f4cd8c8ea39720319"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:16:25 crc kubenswrapper[4546]: I0201 07:16:25.422318 4546 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://8f9e44d0438a4e3efd7cd015e8a7547952b907067a67777f4cd8c8ea39720319" gracePeriod=600 Feb 01 07:16:25 crc kubenswrapper[4546]: I0201 07:16:25.745459 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="8f9e44d0438a4e3efd7cd015e8a7547952b907067a67777f4cd8c8ea39720319" exitCode=0 Feb 01 07:16:25 crc kubenswrapper[4546]: I0201 07:16:25.745695 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"8f9e44d0438a4e3efd7cd015e8a7547952b907067a67777f4cd8c8ea39720319"} Feb 01 07:16:25 crc kubenswrapper[4546]: I0201 07:16:25.745729 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65"} Feb 01 07:16:25 crc kubenswrapper[4546]: I0201 07:16:25.745747 4546 scope.go:117] "RemoveContainer" containerID="67150390639ede1718dabc5b83cc5517463ff588775cd05db11d596afe6d925f" Feb 01 07:16:29 crc kubenswrapper[4546]: I0201 07:16:29.785445 4546 generic.go:334] "Generic (PLEG): container finished" podID="e9657e9f-b182-4053-8b91-2769b0da218c" containerID="b2e724e66545a6f99b37e613a0c5003acc6568eee66988b5dd93afa59a5a1bd2" exitCode=0 Feb 01 07:16:29 crc kubenswrapper[4546]: I0201 07:16:29.785519 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" event={"ID":"e9657e9f-b182-4053-8b91-2769b0da218c","Type":"ContainerDied","Data":"b2e724e66545a6f99b37e613a0c5003acc6568eee66988b5dd93afa59a5a1bd2"} Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 
07:16:31.170954 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.229161 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-ssh-key-openstack-edpm-ipam\") pod \"e9657e9f-b182-4053-8b91-2769b0da218c\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.229381 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-secret-0\") pod \"e9657e9f-b182-4053-8b91-2769b0da218c\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.229419 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-inventory\") pod \"e9657e9f-b182-4053-8b91-2769b0da218c\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.229550 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbwfl\" (UniqueName: \"kubernetes.io/projected/e9657e9f-b182-4053-8b91-2769b0da218c-kube-api-access-mbwfl\") pod \"e9657e9f-b182-4053-8b91-2769b0da218c\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.229628 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-combined-ca-bundle\") pod \"e9657e9f-b182-4053-8b91-2769b0da218c\" (UID: \"e9657e9f-b182-4053-8b91-2769b0da218c\") " Feb 01 07:16:31 crc 
kubenswrapper[4546]: I0201 07:16:31.235918 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9657e9f-b182-4053-8b91-2769b0da218c-kube-api-access-mbwfl" (OuterVolumeSpecName: "kube-api-access-mbwfl") pod "e9657e9f-b182-4053-8b91-2769b0da218c" (UID: "e9657e9f-b182-4053-8b91-2769b0da218c"). InnerVolumeSpecName "kube-api-access-mbwfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.245962 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e9657e9f-b182-4053-8b91-2769b0da218c" (UID: "e9657e9f-b182-4053-8b91-2769b0da218c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.258355 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9657e9f-b182-4053-8b91-2769b0da218c" (UID: "e9657e9f-b182-4053-8b91-2769b0da218c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.258503 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-inventory" (OuterVolumeSpecName: "inventory") pod "e9657e9f-b182-4053-8b91-2769b0da218c" (UID: "e9657e9f-b182-4053-8b91-2769b0da218c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.260075 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e9657e9f-b182-4053-8b91-2769b0da218c" (UID: "e9657e9f-b182-4053-8b91-2769b0da218c"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.333384 4546 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.333440 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.333456 4546 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.333472 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9657e9f-b182-4053-8b91-2769b0da218c-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.333487 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbwfl\" (UniqueName: \"kubernetes.io/projected/e9657e9f-b182-4053-8b91-2769b0da218c-kube-api-access-mbwfl\") on node \"crc\" DevicePath \"\"" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.811482 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" event={"ID":"e9657e9f-b182-4053-8b91-2769b0da218c","Type":"ContainerDied","Data":"2c7f27849b427dd81f40f31dae06185ddde8b503ecbb65692f7102e2f3d2e4ba"} Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.811530 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c7f27849b427dd81f40f31dae06185ddde8b503ecbb65692f7102e2f3d2e4ba" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.811587 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-nrr2b" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.969990 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s"] Feb 01 07:16:31 crc kubenswrapper[4546]: E0201 07:16:31.970534 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9657e9f-b182-4053-8b91-2769b0da218c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.970557 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9657e9f-b182-4053-8b91-2769b0da218c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 01 07:16:31 crc kubenswrapper[4546]: E0201 07:16:31.970578 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86b3853-383a-4ffa-8256-b7c5ec09e580" containerName="collect-profiles" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.970585 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86b3853-383a-4ffa-8256-b7c5ec09e580" containerName="collect-profiles" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.970786 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9657e9f-b182-4053-8b91-2769b0da218c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.970814 4546 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f86b3853-383a-4ffa-8256-b7c5ec09e580" containerName="collect-profiles" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.971630 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.979503 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.979666 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.979774 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.979850 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.979908 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.979937 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.981226 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:16:31 crc kubenswrapper[4546]: I0201 07:16:31.983666 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s"] Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.050636 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.050835 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.050933 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.051030 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.051133 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wktmg\" (UniqueName: \"kubernetes.io/projected/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-kube-api-access-wktmg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: 
\"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.051238 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.051319 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.051535 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.051623 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 
07:16:32.153941 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.154200 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.154316 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.154407 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.154531 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-ssh-key-openstack-edpm-ipam\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.154971 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wktmg\" (UniqueName: \"kubernetes.io/projected/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-kube-api-access-wktmg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.155089 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.155162 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.155323 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.159742 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.160215 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.160716 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.160788 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.161392 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: 
\"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.162334 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.165609 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.166180 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.170109 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wktmg\" (UniqueName: \"kubernetes.io/projected/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-kube-api-access-wktmg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m672s\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.287683 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.788807 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s"] Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.793697 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:16:32 crc kubenswrapper[4546]: I0201 07:16:32.819690 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" event={"ID":"f292e70b-5ea8-4991-9cea-c4c869f6ebc1","Type":"ContainerStarted","Data":"2708fdeb3e2698e9091066ae2357ec56f303702968b44a6af3c5e0518d77ed5f"} Feb 01 07:16:33 crc kubenswrapper[4546]: I0201 07:16:33.830061 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" event={"ID":"f292e70b-5ea8-4991-9cea-c4c869f6ebc1","Type":"ContainerStarted","Data":"02b6d5cbd8e5c57a1091e8eeb5f755473254747ca868959c2fbb06ab3377e782"} Feb 01 07:16:33 crc kubenswrapper[4546]: I0201 07:16:33.849634 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" podStartSLOduration=2.2643075169999998 podStartE2EDuration="2.849612775s" podCreationTimestamp="2026-02-01 07:16:31 +0000 UTC" firstStartedPulling="2026-02-01 07:16:32.7934947 +0000 UTC m=+2023.444430715" lastFinishedPulling="2026-02-01 07:16:33.378799957 +0000 UTC m=+2024.029735973" observedRunningTime="2026-02-01 07:16:33.846063092 +0000 UTC m=+2024.496999108" watchObservedRunningTime="2026-02-01 07:16:33.849612775 +0000 UTC m=+2024.500548791" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.261930 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lq9v6"] Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.269045 4546 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.271088 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lq9v6"] Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.429949 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfmm9\" (UniqueName: \"kubernetes.io/projected/a78de406-ec23-4235-8e93-cfdd88cd394d-kube-api-access-zfmm9\") pod \"certified-operators-lq9v6\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.430010 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-utilities\") pod \"certified-operators-lq9v6\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.430307 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-catalog-content\") pod \"certified-operators-lq9v6\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.531756 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfmm9\" (UniqueName: \"kubernetes.io/projected/a78de406-ec23-4235-8e93-cfdd88cd394d-kube-api-access-zfmm9\") pod \"certified-operators-lq9v6\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 
07:17:19.531807 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-utilities\") pod \"certified-operators-lq9v6\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.531893 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-catalog-content\") pod \"certified-operators-lq9v6\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.532346 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-catalog-content\") pod \"certified-operators-lq9v6\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.532561 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-utilities\") pod \"certified-operators-lq9v6\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.552323 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfmm9\" (UniqueName: \"kubernetes.io/projected/a78de406-ec23-4235-8e93-cfdd88cd394d-kube-api-access-zfmm9\") pod \"certified-operators-lq9v6\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.588899 4546 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:19 crc kubenswrapper[4546]: I0201 07:17:19.986582 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lq9v6"] Feb 01 07:17:20 crc kubenswrapper[4546]: I0201 07:17:20.315460 4546 generic.go:334] "Generic (PLEG): container finished" podID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerID="61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a" exitCode=0 Feb 01 07:17:20 crc kubenswrapper[4546]: I0201 07:17:20.315733 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq9v6" event={"ID":"a78de406-ec23-4235-8e93-cfdd88cd394d","Type":"ContainerDied","Data":"61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a"} Feb 01 07:17:20 crc kubenswrapper[4546]: I0201 07:17:20.315762 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq9v6" event={"ID":"a78de406-ec23-4235-8e93-cfdd88cd394d","Type":"ContainerStarted","Data":"7fae9fb7e723cbef3e87d46ca9a3ff6525a07133e458e95d5d3c4d017df4e7a0"} Feb 01 07:17:21 crc kubenswrapper[4546]: I0201 07:17:21.328222 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq9v6" event={"ID":"a78de406-ec23-4235-8e93-cfdd88cd394d","Type":"ContainerStarted","Data":"d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4"} Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.252081 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rp5n2"] Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.263752 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.331900 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rp5n2"] Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.344028 4546 generic.go:334] "Generic (PLEG): container finished" podID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerID="d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4" exitCode=0 Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.344102 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq9v6" event={"ID":"a78de406-ec23-4235-8e93-cfdd88cd394d","Type":"ContainerDied","Data":"d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4"} Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.407053 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-utilities\") pod \"redhat-marketplace-rp5n2\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.407698 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzn5k\" (UniqueName: \"kubernetes.io/projected/2ab1e0c2-e025-422e-8523-96e4b51c00a0-kube-api-access-zzn5k\") pod \"redhat-marketplace-rp5n2\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.408048 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-catalog-content\") pod \"redhat-marketplace-rp5n2\" (UID: 
\"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.510845 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzn5k\" (UniqueName: \"kubernetes.io/projected/2ab1e0c2-e025-422e-8523-96e4b51c00a0-kube-api-access-zzn5k\") pod \"redhat-marketplace-rp5n2\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.511234 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-catalog-content\") pod \"redhat-marketplace-rp5n2\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.511731 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-catalog-content\") pod \"redhat-marketplace-rp5n2\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.512217 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-utilities\") pod \"redhat-marketplace-rp5n2\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.512263 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-utilities\") pod \"redhat-marketplace-rp5n2\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " 
pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.545695 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzn5k\" (UniqueName: \"kubernetes.io/projected/2ab1e0c2-e025-422e-8523-96e4b51c00a0-kube-api-access-zzn5k\") pod \"redhat-marketplace-rp5n2\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:22 crc kubenswrapper[4546]: I0201 07:17:22.601403 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:23 crc kubenswrapper[4546]: I0201 07:17:23.103555 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rp5n2"] Feb 01 07:17:23 crc kubenswrapper[4546]: I0201 07:17:23.359632 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq9v6" event={"ID":"a78de406-ec23-4235-8e93-cfdd88cd394d","Type":"ContainerStarted","Data":"90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1"} Feb 01 07:17:23 crc kubenswrapper[4546]: I0201 07:17:23.362414 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rp5n2" event={"ID":"2ab1e0c2-e025-422e-8523-96e4b51c00a0","Type":"ContainerStarted","Data":"60305d786ccbae6f8f6e1a7cc4c1b5d53ad86517bcdac986e33689fd0dad01e2"} Feb 01 07:17:23 crc kubenswrapper[4546]: I0201 07:17:23.362537 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rp5n2" event={"ID":"2ab1e0c2-e025-422e-8523-96e4b51c00a0","Type":"ContainerStarted","Data":"bccb02696d0c9ae4703846b28d340f6e0c23e3ad3120e0e28b53203a7a30dc80"} Feb 01 07:17:23 crc kubenswrapper[4546]: I0201 07:17:23.388677 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lq9v6" 
podStartSLOduration=1.878770107 podStartE2EDuration="4.388654492s" podCreationTimestamp="2026-02-01 07:17:19 +0000 UTC" firstStartedPulling="2026-02-01 07:17:20.322469414 +0000 UTC m=+2070.973405430" lastFinishedPulling="2026-02-01 07:17:22.832353799 +0000 UTC m=+2073.483289815" observedRunningTime="2026-02-01 07:17:23.381364648 +0000 UTC m=+2074.032300665" watchObservedRunningTime="2026-02-01 07:17:23.388654492 +0000 UTC m=+2074.039590508" Feb 01 07:17:24 crc kubenswrapper[4546]: I0201 07:17:24.383316 4546 generic.go:334] "Generic (PLEG): container finished" podID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerID="60305d786ccbae6f8f6e1a7cc4c1b5d53ad86517bcdac986e33689fd0dad01e2" exitCode=0 Feb 01 07:17:24 crc kubenswrapper[4546]: I0201 07:17:24.383411 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rp5n2" event={"ID":"2ab1e0c2-e025-422e-8523-96e4b51c00a0","Type":"ContainerDied","Data":"60305d786ccbae6f8f6e1a7cc4c1b5d53ad86517bcdac986e33689fd0dad01e2"} Feb 01 07:17:25 crc kubenswrapper[4546]: I0201 07:17:25.397551 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rp5n2" event={"ID":"2ab1e0c2-e025-422e-8523-96e4b51c00a0","Type":"ContainerStarted","Data":"87030e1218cea1ce650becd89e0b27726f71d84e2d2f0c160505ef16bf743850"} Feb 01 07:17:26 crc kubenswrapper[4546]: I0201 07:17:26.420483 4546 generic.go:334] "Generic (PLEG): container finished" podID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerID="87030e1218cea1ce650becd89e0b27726f71d84e2d2f0c160505ef16bf743850" exitCode=0 Feb 01 07:17:26 crc kubenswrapper[4546]: I0201 07:17:26.420839 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rp5n2" event={"ID":"2ab1e0c2-e025-422e-8523-96e4b51c00a0","Type":"ContainerDied","Data":"87030e1218cea1ce650becd89e0b27726f71d84e2d2f0c160505ef16bf743850"} Feb 01 07:17:27 crc kubenswrapper[4546]: I0201 07:17:27.432124 4546 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rp5n2" event={"ID":"2ab1e0c2-e025-422e-8523-96e4b51c00a0","Type":"ContainerStarted","Data":"883060e5e5324e56c0ed2e21de13b5d08b457a1ba8ab17b7a6d0ab6d6822af19"} Feb 01 07:17:27 crc kubenswrapper[4546]: I0201 07:17:27.455489 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rp5n2" podStartSLOduration=2.928649014 podStartE2EDuration="5.455465181s" podCreationTimestamp="2026-02-01 07:17:22 +0000 UTC" firstStartedPulling="2026-02-01 07:17:24.385016265 +0000 UTC m=+2075.035952281" lastFinishedPulling="2026-02-01 07:17:26.911832432 +0000 UTC m=+2077.562768448" observedRunningTime="2026-02-01 07:17:27.447283546 +0000 UTC m=+2078.098219562" watchObservedRunningTime="2026-02-01 07:17:27.455465181 +0000 UTC m=+2078.106401198" Feb 01 07:17:29 crc kubenswrapper[4546]: I0201 07:17:29.589635 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:29 crc kubenswrapper[4546]: I0201 07:17:29.590459 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:29 crc kubenswrapper[4546]: I0201 07:17:29.630413 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:30 crc kubenswrapper[4546]: I0201 07:17:30.514821 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:30 crc kubenswrapper[4546]: I0201 07:17:30.845701 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lq9v6"] Feb 01 07:17:32 crc kubenswrapper[4546]: I0201 07:17:32.495584 4546 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-lq9v6" podUID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerName="registry-server" containerID="cri-o://90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1" gracePeriod=2 Feb 01 07:17:32 crc kubenswrapper[4546]: I0201 07:17:32.601876 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:32 crc kubenswrapper[4546]: I0201 07:17:32.601948 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:32 crc kubenswrapper[4546]: I0201 07:17:32.740729 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.175711 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.300665 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfmm9\" (UniqueName: \"kubernetes.io/projected/a78de406-ec23-4235-8e93-cfdd88cd394d-kube-api-access-zfmm9\") pod \"a78de406-ec23-4235-8e93-cfdd88cd394d\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.300959 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-catalog-content\") pod \"a78de406-ec23-4235-8e93-cfdd88cd394d\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.303036 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-utilities\") pod 
\"a78de406-ec23-4235-8e93-cfdd88cd394d\" (UID: \"a78de406-ec23-4235-8e93-cfdd88cd394d\") " Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.303620 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-utilities" (OuterVolumeSpecName: "utilities") pod "a78de406-ec23-4235-8e93-cfdd88cd394d" (UID: "a78de406-ec23-4235-8e93-cfdd88cd394d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.303998 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.307612 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78de406-ec23-4235-8e93-cfdd88cd394d-kube-api-access-zfmm9" (OuterVolumeSpecName: "kube-api-access-zfmm9") pod "a78de406-ec23-4235-8e93-cfdd88cd394d" (UID: "a78de406-ec23-4235-8e93-cfdd88cd394d"). InnerVolumeSpecName "kube-api-access-zfmm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.344309 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a78de406-ec23-4235-8e93-cfdd88cd394d" (UID: "a78de406-ec23-4235-8e93-cfdd88cd394d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.405028 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfmm9\" (UniqueName: \"kubernetes.io/projected/a78de406-ec23-4235-8e93-cfdd88cd394d-kube-api-access-zfmm9\") on node \"crc\" DevicePath \"\"" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.405092 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78de406-ec23-4235-8e93-cfdd88cd394d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.512940 4546 generic.go:334] "Generic (PLEG): container finished" podID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerID="90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1" exitCode=0 Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.513049 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lq9v6" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.513050 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq9v6" event={"ID":"a78de406-ec23-4235-8e93-cfdd88cd394d","Type":"ContainerDied","Data":"90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1"} Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.513119 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lq9v6" event={"ID":"a78de406-ec23-4235-8e93-cfdd88cd394d","Type":"ContainerDied","Data":"7fae9fb7e723cbef3e87d46ca9a3ff6525a07133e458e95d5d3c4d017df4e7a0"} Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.513152 4546 scope.go:117] "RemoveContainer" containerID="90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.545987 4546 scope.go:117] "RemoveContainer" 
containerID="d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.565567 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lq9v6"] Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.578736 4546 scope.go:117] "RemoveContainer" containerID="61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.579630 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lq9v6"] Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.583801 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.634070 4546 scope.go:117] "RemoveContainer" containerID="90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1" Feb 01 07:17:33 crc kubenswrapper[4546]: E0201 07:17:33.634567 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1\": container with ID starting with 90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1 not found: ID does not exist" containerID="90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.634612 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1"} err="failed to get container status \"90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1\": rpc error: code = NotFound desc = could not find container \"90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1\": container with ID starting with 
90aa60e635dd641a85b08bf961b6bcc0b5638fb4a20d2a072c1cfc0bef7a04f1 not found: ID does not exist" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.634643 4546 scope.go:117] "RemoveContainer" containerID="d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4" Feb 01 07:17:33 crc kubenswrapper[4546]: E0201 07:17:33.635147 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4\": container with ID starting with d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4 not found: ID does not exist" containerID="d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.635182 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4"} err="failed to get container status \"d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4\": rpc error: code = NotFound desc = could not find container \"d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4\": container with ID starting with d816ecd33f46dfbebe5f25b074d3bcb43f6b160e8437105cc45273b8367963c4 not found: ID does not exist" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.635208 4546 scope.go:117] "RemoveContainer" containerID="61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a" Feb 01 07:17:33 crc kubenswrapper[4546]: E0201 07:17:33.635594 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a\": container with ID starting with 61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a not found: ID does not exist" containerID="61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a" Feb 01 07:17:33 crc 
kubenswrapper[4546]: I0201 07:17:33.635616 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a"} err="failed to get container status \"61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a\": rpc error: code = NotFound desc = could not find container \"61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a\": container with ID starting with 61a53433ecafa3fe38d844cc122d0133305ea74c987c16711a7999c30ecf696a not found: ID does not exist" Feb 01 07:17:33 crc kubenswrapper[4546]: I0201 07:17:33.665820 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78de406-ec23-4235-8e93-cfdd88cd394d" path="/var/lib/kubelet/pods/a78de406-ec23-4235-8e93-cfdd88cd394d/volumes" Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.242011 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rp5n2"] Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.242730 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rp5n2" podUID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerName="registry-server" containerID="cri-o://883060e5e5324e56c0ed2e21de13b5d08b457a1ba8ab17b7a6d0ab6d6822af19" gracePeriod=2 Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.623916 4546 generic.go:334] "Generic (PLEG): container finished" podID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerID="883060e5e5324e56c0ed2e21de13b5d08b457a1ba8ab17b7a6d0ab6d6822af19" exitCode=0 Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.623975 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rp5n2" event={"ID":"2ab1e0c2-e025-422e-8523-96e4b51c00a0","Type":"ContainerDied","Data":"883060e5e5324e56c0ed2e21de13b5d08b457a1ba8ab17b7a6d0ab6d6822af19"} Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 
07:17:38.719494 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.855430 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzn5k\" (UniqueName: \"kubernetes.io/projected/2ab1e0c2-e025-422e-8523-96e4b51c00a0-kube-api-access-zzn5k\") pod \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.855656 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-catalog-content\") pod \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.855883 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-utilities\") pod \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\" (UID: \"2ab1e0c2-e025-422e-8523-96e4b51c00a0\") " Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.858904 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-utilities" (OuterVolumeSpecName: "utilities") pod "2ab1e0c2-e025-422e-8523-96e4b51c00a0" (UID: "2ab1e0c2-e025-422e-8523-96e4b51c00a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.863291 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab1e0c2-e025-422e-8523-96e4b51c00a0-kube-api-access-zzn5k" (OuterVolumeSpecName: "kube-api-access-zzn5k") pod "2ab1e0c2-e025-422e-8523-96e4b51c00a0" (UID: "2ab1e0c2-e025-422e-8523-96e4b51c00a0"). InnerVolumeSpecName "kube-api-access-zzn5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.884616 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ab1e0c2-e025-422e-8523-96e4b51c00a0" (UID: "2ab1e0c2-e025-422e-8523-96e4b51c00a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.959958 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.960100 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzn5k\" (UniqueName: \"kubernetes.io/projected/2ab1e0c2-e025-422e-8523-96e4b51c00a0-kube-api-access-zzn5k\") on node \"crc\" DevicePath \"\"" Feb 01 07:17:38 crc kubenswrapper[4546]: I0201 07:17:38.960165 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab1e0c2-e025-422e-8523-96e4b51c00a0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:17:39 crc kubenswrapper[4546]: I0201 07:17:39.637839 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rp5n2" 
event={"ID":"2ab1e0c2-e025-422e-8523-96e4b51c00a0","Type":"ContainerDied","Data":"bccb02696d0c9ae4703846b28d340f6e0c23e3ad3120e0e28b53203a7a30dc80"} Feb 01 07:17:39 crc kubenswrapper[4546]: I0201 07:17:39.639333 4546 scope.go:117] "RemoveContainer" containerID="883060e5e5324e56c0ed2e21de13b5d08b457a1ba8ab17b7a6d0ab6d6822af19" Feb 01 07:17:39 crc kubenswrapper[4546]: I0201 07:17:39.638135 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rp5n2" Feb 01 07:17:39 crc kubenswrapper[4546]: I0201 07:17:39.668275 4546 scope.go:117] "RemoveContainer" containerID="87030e1218cea1ce650becd89e0b27726f71d84e2d2f0c160505ef16bf743850" Feb 01 07:17:39 crc kubenswrapper[4546]: I0201 07:17:39.687125 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rp5n2"] Feb 01 07:17:39 crc kubenswrapper[4546]: I0201 07:17:39.698471 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rp5n2"] Feb 01 07:17:39 crc kubenswrapper[4546]: I0201 07:17:39.699664 4546 scope.go:117] "RemoveContainer" containerID="60305d786ccbae6f8f6e1a7cc4c1b5d53ad86517bcdac986e33689fd0dad01e2" Feb 01 07:17:41 crc kubenswrapper[4546]: I0201 07:17:41.664402 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" path="/var/lib/kubelet/pods/2ab1e0c2-e025-422e-8523-96e4b51c00a0/volumes" Feb 01 07:18:22 crc kubenswrapper[4546]: I0201 07:18:22.075536 4546 generic.go:334] "Generic (PLEG): container finished" podID="f292e70b-5ea8-4991-9cea-c4c869f6ebc1" containerID="02b6d5cbd8e5c57a1091e8eeb5f755473254747ca868959c2fbb06ab3377e782" exitCode=0 Feb 01 07:18:22 crc kubenswrapper[4546]: I0201 07:18:22.075637 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" 
event={"ID":"f292e70b-5ea8-4991-9cea-c4c869f6ebc1","Type":"ContainerDied","Data":"02b6d5cbd8e5c57a1091e8eeb5f755473254747ca868959c2fbb06ab3377e782"} Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.556869 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.668316 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wktmg\" (UniqueName: \"kubernetes.io/projected/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-kube-api-access-wktmg\") pod \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.668362 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-extra-config-0\") pod \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.668397 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-1\") pod \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.668489 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-0\") pod \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.668672 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-inventory\") pod \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.668712 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-ssh-key-openstack-edpm-ipam\") pod \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.668781 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-0\") pod \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.668829 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-combined-ca-bundle\") pod \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.668934 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-1\") pod \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\" (UID: \"f292e70b-5ea8-4991-9cea-c4c869f6ebc1\") " Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.686350 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-kube-api-access-wktmg" (OuterVolumeSpecName: "kube-api-access-wktmg") pod "f292e70b-5ea8-4991-9cea-c4c869f6ebc1" 
(UID: "f292e70b-5ea8-4991-9cea-c4c869f6ebc1"). InnerVolumeSpecName "kube-api-access-wktmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.693020 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f292e70b-5ea8-4991-9cea-c4c869f6ebc1" (UID: "f292e70b-5ea8-4991-9cea-c4c869f6ebc1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.710123 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f292e70b-5ea8-4991-9cea-c4c869f6ebc1" (UID: "f292e70b-5ea8-4991-9cea-c4c869f6ebc1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.710908 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f292e70b-5ea8-4991-9cea-c4c869f6ebc1" (UID: "f292e70b-5ea8-4991-9cea-c4c869f6ebc1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.720020 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "f292e70b-5ea8-4991-9cea-c4c869f6ebc1" (UID: "f292e70b-5ea8-4991-9cea-c4c869f6ebc1"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.724140 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-inventory" (OuterVolumeSpecName: "inventory") pod "f292e70b-5ea8-4991-9cea-c4c869f6ebc1" (UID: "f292e70b-5ea8-4991-9cea-c4c869f6ebc1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.724456 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f292e70b-5ea8-4991-9cea-c4c869f6ebc1" (UID: "f292e70b-5ea8-4991-9cea-c4c869f6ebc1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.727364 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f292e70b-5ea8-4991-9cea-c4c869f6ebc1" (UID: "f292e70b-5ea8-4991-9cea-c4c869f6ebc1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.728982 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f292e70b-5ea8-4991-9cea-c4c869f6ebc1" (UID: "f292e70b-5ea8-4991-9cea-c4c869f6ebc1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.772217 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wktmg\" (UniqueName: \"kubernetes.io/projected/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-kube-api-access-wktmg\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.772247 4546 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.772257 4546 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.772269 4546 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.772280 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.772290 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.772300 4546 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-0\") on node 
\"crc\" DevicePath \"\"" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.772309 4546 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:23 crc kubenswrapper[4546]: I0201 07:18:23.772318 4546 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f292e70b-5ea8-4991-9cea-c4c869f6ebc1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.096194 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" event={"ID":"f292e70b-5ea8-4991-9cea-c4c869f6ebc1","Type":"ContainerDied","Data":"2708fdeb3e2698e9091066ae2357ec56f303702968b44a6af3c5e0518d77ed5f"} Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.096252 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2708fdeb3e2698e9091066ae2357ec56f303702968b44a6af3c5e0518d77ed5f" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.096263 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m672s" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.286300 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz"] Feb 01 07:18:24 crc kubenswrapper[4546]: E0201 07:18:24.286762 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerName="extract-content" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.286783 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerName="extract-content" Feb 01 07:18:24 crc kubenswrapper[4546]: E0201 07:18:24.286805 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerName="extract-utilities" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.286811 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerName="extract-utilities" Feb 01 07:18:24 crc kubenswrapper[4546]: E0201 07:18:24.286824 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerName="extract-content" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.286830 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerName="extract-content" Feb 01 07:18:24 crc kubenswrapper[4546]: E0201 07:18:24.286842 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerName="registry-server" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.286849 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerName="registry-server" Feb 01 07:18:24 crc kubenswrapper[4546]: E0201 07:18:24.286874 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerName="registry-server" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.286880 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerName="registry-server" Feb 01 07:18:24 crc kubenswrapper[4546]: E0201 07:18:24.286895 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f292e70b-5ea8-4991-9cea-c4c869f6ebc1" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.286904 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f292e70b-5ea8-4991-9cea-c4c869f6ebc1" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 01 07:18:24 crc kubenswrapper[4546]: E0201 07:18:24.286922 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerName="extract-utilities" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.286929 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerName="extract-utilities" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.287127 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78de406-ec23-4235-8e93-cfdd88cd394d" containerName="registry-server" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.287139 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab1e0c2-e025-422e-8523-96e4b51c00a0" containerName="registry-server" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.287150 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f292e70b-5ea8-4991-9cea-c4c869f6ebc1" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.287823 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.296896 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.296987 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.297583 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.297993 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pctfm" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.300014 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.300060 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz"] Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.385793 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.385839 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: 
\"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.385965 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.386023 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.386056 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.386181 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7d78\" (UniqueName: \"kubernetes.io/projected/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-kube-api-access-z7d78\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.386286 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.487644 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.487687 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.487736 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.487785 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.487809 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.487908 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7d78\" (UniqueName: \"kubernetes.io/projected/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-kube-api-access-z7d78\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.487992 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.492553 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.492602 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.493394 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.494641 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.496651 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.496840 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.503323 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7d78\" (UniqueName: \"kubernetes.io/projected/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-kube-api-access-z7d78\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:24 crc kubenswrapper[4546]: I0201 07:18:24.608533 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" Feb 01 07:18:25 crc kubenswrapper[4546]: I0201 07:18:25.128879 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz"] Feb 01 07:18:25 crc kubenswrapper[4546]: I0201 07:18:25.420523 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:18:25 crc kubenswrapper[4546]: I0201 07:18:25.421100 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.035370 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zd95c"] Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.037508 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.045285 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zd95c"] Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.116373 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" event={"ID":"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc","Type":"ContainerStarted","Data":"45347cc85d2ad6cd7b29d3260f1d47418f2488eadca6b45f9c3326554c7339fe"} Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.116422 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" event={"ID":"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc","Type":"ContainerStarted","Data":"091129cddcf4bb54d291f847bb3359b4c6b27745f029bcd49b01ebb5b398d146"} Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.132868 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhnr\" (UniqueName: \"kubernetes.io/projected/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-kube-api-access-dbhnr\") pod \"redhat-operators-zd95c\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.133040 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-catalog-content\") pod \"redhat-operators-zd95c\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.133265 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-utilities\") pod \"redhat-operators-zd95c\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.135284 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" podStartSLOduration=1.545634071 podStartE2EDuration="2.13526337s" podCreationTimestamp="2026-02-01 07:18:24 +0000 UTC" firstStartedPulling="2026-02-01 07:18:25.132507784 +0000 UTC m=+2135.783443800" lastFinishedPulling="2026-02-01 07:18:25.722137084 +0000 UTC m=+2136.373073099" observedRunningTime="2026-02-01 07:18:26.128517803 +0000 UTC m=+2136.779453819" watchObservedRunningTime="2026-02-01 07:18:26.13526337 +0000 UTC m=+2136.786199386" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.235077 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhnr\" (UniqueName: \"kubernetes.io/projected/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-kube-api-access-dbhnr\") pod \"redhat-operators-zd95c\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.235140 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-catalog-content\") pod \"redhat-operators-zd95c\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.235188 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-utilities\") pod \"redhat-operators-zd95c\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " 
pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.235656 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-utilities\") pod \"redhat-operators-zd95c\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.235719 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-catalog-content\") pod \"redhat-operators-zd95c\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.257037 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhnr\" (UniqueName: \"kubernetes.io/projected/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-kube-api-access-dbhnr\") pod \"redhat-operators-zd95c\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.353747 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:26 crc kubenswrapper[4546]: I0201 07:18:26.778426 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zd95c"] Feb 01 07:18:26 crc kubenswrapper[4546]: W0201 07:18:26.782810 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda21dcf78_a61e_435e_a89c_7b53e9d3ba5d.slice/crio-d35cff36ced2a87d113a73770733b35431bc2efcf0eb96df689e2b595cfb98bc WatchSource:0}: Error finding container d35cff36ced2a87d113a73770733b35431bc2efcf0eb96df689e2b595cfb98bc: Status 404 returned error can't find the container with id d35cff36ced2a87d113a73770733b35431bc2efcf0eb96df689e2b595cfb98bc Feb 01 07:18:27 crc kubenswrapper[4546]: I0201 07:18:27.129593 4546 generic.go:334] "Generic (PLEG): container finished" podID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerID="67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f" exitCode=0 Feb 01 07:18:27 crc kubenswrapper[4546]: I0201 07:18:27.131290 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd95c" event={"ID":"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d","Type":"ContainerDied","Data":"67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f"} Feb 01 07:18:27 crc kubenswrapper[4546]: I0201 07:18:27.131348 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd95c" event={"ID":"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d","Type":"ContainerStarted","Data":"d35cff36ced2a87d113a73770733b35431bc2efcf0eb96df689e2b595cfb98bc"} Feb 01 07:18:28 crc kubenswrapper[4546]: I0201 07:18:28.144623 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd95c" 
event={"ID":"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d","Type":"ContainerStarted","Data":"f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f"} Feb 01 07:18:31 crc kubenswrapper[4546]: I0201 07:18:31.204006 4546 generic.go:334] "Generic (PLEG): container finished" podID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerID="f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f" exitCode=0 Feb 01 07:18:31 crc kubenswrapper[4546]: I0201 07:18:31.204093 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd95c" event={"ID":"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d","Type":"ContainerDied","Data":"f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f"} Feb 01 07:18:32 crc kubenswrapper[4546]: I0201 07:18:32.218641 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd95c" event={"ID":"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d","Type":"ContainerStarted","Data":"d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed"} Feb 01 07:18:32 crc kubenswrapper[4546]: I0201 07:18:32.252161 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zd95c" podStartSLOduration=1.599283665 podStartE2EDuration="6.252142943s" podCreationTimestamp="2026-02-01 07:18:26 +0000 UTC" firstStartedPulling="2026-02-01 07:18:27.132789248 +0000 UTC m=+2137.783725264" lastFinishedPulling="2026-02-01 07:18:31.785648526 +0000 UTC m=+2142.436584542" observedRunningTime="2026-02-01 07:18:32.246305897 +0000 UTC m=+2142.897241914" watchObservedRunningTime="2026-02-01 07:18:32.252142943 +0000 UTC m=+2142.903078959" Feb 01 07:18:36 crc kubenswrapper[4546]: I0201 07:18:36.354906 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:36 crc kubenswrapper[4546]: I0201 07:18:36.355691 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:37 crc kubenswrapper[4546]: I0201 07:18:37.403551 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zd95c" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerName="registry-server" probeResult="failure" output=< Feb 01 07:18:37 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:18:37 crc kubenswrapper[4546]: > Feb 01 07:18:46 crc kubenswrapper[4546]: I0201 07:18:46.401795 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:46 crc kubenswrapper[4546]: I0201 07:18:46.447060 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:46 crc kubenswrapper[4546]: I0201 07:18:46.640816 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zd95c"] Feb 01 07:18:48 crc kubenswrapper[4546]: I0201 07:18:48.382347 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zd95c" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerName="registry-server" containerID="cri-o://d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed" gracePeriod=2 Feb 01 07:18:48 crc kubenswrapper[4546]: I0201 07:18:48.782047 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:48 crc kubenswrapper[4546]: I0201 07:18:48.906421 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbhnr\" (UniqueName: \"kubernetes.io/projected/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-kube-api-access-dbhnr\") pod \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " Feb 01 07:18:48 crc kubenswrapper[4546]: I0201 07:18:48.906630 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-catalog-content\") pod \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " Feb 01 07:18:48 crc kubenswrapper[4546]: I0201 07:18:48.906892 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-utilities\") pod \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\" (UID: \"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d\") " Feb 01 07:18:48 crc kubenswrapper[4546]: I0201 07:18:48.907830 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-utilities" (OuterVolumeSpecName: "utilities") pod "a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" (UID: "a21dcf78-a61e-435e-a89c-7b53e9d3ba5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:18:48 crc kubenswrapper[4546]: I0201 07:18:48.913473 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-kube-api-access-dbhnr" (OuterVolumeSpecName: "kube-api-access-dbhnr") pod "a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" (UID: "a21dcf78-a61e-435e-a89c-7b53e9d3ba5d"). InnerVolumeSpecName "kube-api-access-dbhnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.009880 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbhnr\" (UniqueName: \"kubernetes.io/projected/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-kube-api-access-dbhnr\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.009912 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.012405 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" (UID: "a21dcf78-a61e-435e-a89c-7b53e9d3ba5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.113124 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.392536 4546 generic.go:334] "Generic (PLEG): container finished" podID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerID="d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed" exitCode=0 Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.392593 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zd95c" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.392605 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd95c" event={"ID":"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d","Type":"ContainerDied","Data":"d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed"} Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.392669 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zd95c" event={"ID":"a21dcf78-a61e-435e-a89c-7b53e9d3ba5d","Type":"ContainerDied","Data":"d35cff36ced2a87d113a73770733b35431bc2efcf0eb96df689e2b595cfb98bc"} Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.392694 4546 scope.go:117] "RemoveContainer" containerID="d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.412785 4546 scope.go:117] "RemoveContainer" containerID="f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.427458 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zd95c"] Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.442097 4546 scope.go:117] "RemoveContainer" containerID="67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.442347 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zd95c"] Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.475484 4546 scope.go:117] "RemoveContainer" containerID="d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed" Feb 01 07:18:49 crc kubenswrapper[4546]: E0201 07:18:49.476060 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed\": container with ID starting with d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed not found: ID does not exist" containerID="d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.476129 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed"} err="failed to get container status \"d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed\": rpc error: code = NotFound desc = could not find container \"d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed\": container with ID starting with d616ce4b6bd1468a9e74c9416420e7cf6e2794e4f962d1999d68ec2f88f6bbed not found: ID does not exist" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.476181 4546 scope.go:117] "RemoveContainer" containerID="f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f" Feb 01 07:18:49 crc kubenswrapper[4546]: E0201 07:18:49.476476 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f\": container with ID starting with f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f not found: ID does not exist" containerID="f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.476505 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f"} err="failed to get container status \"f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f\": rpc error: code = NotFound desc = could not find container \"f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f\": container with ID 
starting with f097d153f39c04dd64c06338d157822c548b6c7b0666d8988550043606248a8f not found: ID does not exist" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.476531 4546 scope.go:117] "RemoveContainer" containerID="67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f" Feb 01 07:18:49 crc kubenswrapper[4546]: E0201 07:18:49.476760 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f\": container with ID starting with 67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f not found: ID does not exist" containerID="67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.476805 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f"} err="failed to get container status \"67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f\": rpc error: code = NotFound desc = could not find container \"67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f\": container with ID starting with 67c49269e28c0556f0560181ce7a7adf87f132e5ccb6d1099848211dc0827e0f not found: ID does not exist" Feb 01 07:18:49 crc kubenswrapper[4546]: I0201 07:18:49.667619 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" path="/var/lib/kubelet/pods/a21dcf78-a61e-435e-a89c-7b53e9d3ba5d/volumes" Feb 01 07:18:55 crc kubenswrapper[4546]: I0201 07:18:55.420402 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:18:55 crc kubenswrapper[4546]: I0201 
07:18:55.421118 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:19:25 crc kubenswrapper[4546]: I0201 07:19:25.421020 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:19:25 crc kubenswrapper[4546]: I0201 07:19:25.421713 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:19:25 crc kubenswrapper[4546]: I0201 07:19:25.421772 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:19:25 crc kubenswrapper[4546]: I0201 07:19:25.423151 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:19:25 crc kubenswrapper[4546]: I0201 07:19:25.423226 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" 
containerName="machine-config-daemon" containerID="cri-o://f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" gracePeriod=600 Feb 01 07:19:25 crc kubenswrapper[4546]: E0201 07:19:25.541840 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:19:25 crc kubenswrapper[4546]: I0201 07:19:25.751900 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" exitCode=0 Feb 01 07:19:25 crc kubenswrapper[4546]: I0201 07:19:25.751906 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65"} Feb 01 07:19:25 crc kubenswrapper[4546]: I0201 07:19:25.751969 4546 scope.go:117] "RemoveContainer" containerID="8f9e44d0438a4e3efd7cd015e8a7547952b907067a67777f4cd8c8ea39720319" Feb 01 07:19:25 crc kubenswrapper[4546]: I0201 07:19:25.752796 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:19:25 crc kubenswrapper[4546]: E0201 07:19:25.753310 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:19:37 crc kubenswrapper[4546]: I0201 07:19:37.654983 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:19:37 crc kubenswrapper[4546]: E0201 07:19:37.655933 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:19:50 crc kubenswrapper[4546]: I0201 07:19:50.654773 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:19:50 crc kubenswrapper[4546]: E0201 07:19:50.655732 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:20:05 crc kubenswrapper[4546]: I0201 07:20:05.655516 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:20:05 crc kubenswrapper[4546]: E0201 07:20:05.656886 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.603489 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vzzgh"] Feb 01 07:20:08 crc kubenswrapper[4546]: E0201 07:20:08.603917 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerName="extract-content" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.603933 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerName="extract-content" Feb 01 07:20:08 crc kubenswrapper[4546]: E0201 07:20:08.603946 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerName="extract-utilities" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.603953 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerName="extract-utilities" Feb 01 07:20:08 crc kubenswrapper[4546]: E0201 07:20:08.603966 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerName="registry-server" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.603972 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerName="registry-server" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.604203 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21dcf78-a61e-435e-a89c-7b53e9d3ba5d" containerName="registry-server" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.605476 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.645825 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vzzgh"] Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.706800 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-catalog-content\") pod \"community-operators-vzzgh\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") " pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.707016 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnrkw\" (UniqueName: \"kubernetes.io/projected/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-kube-api-access-pnrkw\") pod \"community-operators-vzzgh\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") " pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.707103 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-utilities\") pod \"community-operators-vzzgh\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") " pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.809543 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-catalog-content\") pod \"community-operators-vzzgh\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") " pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.809954 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pnrkw\" (UniqueName: \"kubernetes.io/projected/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-kube-api-access-pnrkw\") pod \"community-operators-vzzgh\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") " pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.809994 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-utilities\") pod \"community-operators-vzzgh\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") " pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.810028 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-catalog-content\") pod \"community-operators-vzzgh\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") " pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.810499 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-utilities\") pod \"community-operators-vzzgh\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") " pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.829038 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnrkw\" (UniqueName: \"kubernetes.io/projected/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-kube-api-access-pnrkw\") pod \"community-operators-vzzgh\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") " pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:08 crc kubenswrapper[4546]: I0201 07:20:08.935704 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:09 crc kubenswrapper[4546]: I0201 07:20:09.484592 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vzzgh"] Feb 01 07:20:10 crc kubenswrapper[4546]: I0201 07:20:10.180232 4546 generic.go:334] "Generic (PLEG): container finished" podID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerID="fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55" exitCode=0 Feb 01 07:20:10 crc kubenswrapper[4546]: I0201 07:20:10.180375 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzzgh" event={"ID":"d8a3178e-34d2-4b5d-ade9-5b5a44a55705","Type":"ContainerDied","Data":"fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55"} Feb 01 07:20:10 crc kubenswrapper[4546]: I0201 07:20:10.180692 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzzgh" event={"ID":"d8a3178e-34d2-4b5d-ade9-5b5a44a55705","Type":"ContainerStarted","Data":"49b3e957aea4571fb78d834f50c78488f0644461850248952f99777b8cc9e57a"} Feb 01 07:20:12 crc kubenswrapper[4546]: I0201 07:20:12.214909 4546 generic.go:334] "Generic (PLEG): container finished" podID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerID="8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673" exitCode=0 Feb 01 07:20:12 crc kubenswrapper[4546]: I0201 07:20:12.215717 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzzgh" event={"ID":"d8a3178e-34d2-4b5d-ade9-5b5a44a55705","Type":"ContainerDied","Data":"8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673"} Feb 01 07:20:13 crc kubenswrapper[4546]: I0201 07:20:13.229224 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzzgh" 
event={"ID":"d8a3178e-34d2-4b5d-ade9-5b5a44a55705","Type":"ContainerStarted","Data":"76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7"} Feb 01 07:20:18 crc kubenswrapper[4546]: I0201 07:20:18.655758 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:20:18 crc kubenswrapper[4546]: E0201 07:20:18.656767 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:20:18 crc kubenswrapper[4546]: I0201 07:20:18.936442 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:18 crc kubenswrapper[4546]: I0201 07:20:18.936525 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:18 crc kubenswrapper[4546]: I0201 07:20:18.975816 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vzzgh" Feb 01 07:20:18 crc kubenswrapper[4546]: I0201 07:20:18.996617 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vzzgh" podStartSLOduration=8.461781792 podStartE2EDuration="10.996586181s" podCreationTimestamp="2026-02-01 07:20:08 +0000 UTC" firstStartedPulling="2026-02-01 07:20:10.182917349 +0000 UTC m=+2240.833853365" lastFinishedPulling="2026-02-01 07:20:12.717721738 +0000 UTC m=+2243.368657754" observedRunningTime="2026-02-01 07:20:13.256437718 +0000 UTC m=+2243.907373734" watchObservedRunningTime="2026-02-01 
07:20:18.996586181 +0000 UTC m=+2249.647522197"
Feb 01 07:20:19 crc kubenswrapper[4546]: I0201 07:20:19.317737 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vzzgh"
Feb 01 07:20:19 crc kubenswrapper[4546]: I0201 07:20:19.367347 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vzzgh"]
Feb 01 07:20:21 crc kubenswrapper[4546]: I0201 07:20:21.301524 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vzzgh" podUID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerName="registry-server" containerID="cri-o://76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7" gracePeriod=2
Feb 01 07:20:21 crc kubenswrapper[4546]: I0201 07:20:21.743890 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vzzgh"
Feb 01 07:20:21 crc kubenswrapper[4546]: I0201 07:20:21.943613 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-catalog-content\") pod \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") "
Feb 01 07:20:21 crc kubenswrapper[4546]: I0201 07:20:21.943667 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnrkw\" (UniqueName: \"kubernetes.io/projected/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-kube-api-access-pnrkw\") pod \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") "
Feb 01 07:20:21 crc kubenswrapper[4546]: I0201 07:20:21.943893 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-utilities\") pod \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\" (UID: \"d8a3178e-34d2-4b5d-ade9-5b5a44a55705\") "
Feb 01 07:20:21 crc kubenswrapper[4546]: I0201 07:20:21.944531 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-utilities" (OuterVolumeSpecName: "utilities") pod "d8a3178e-34d2-4b5d-ade9-5b5a44a55705" (UID: "d8a3178e-34d2-4b5d-ade9-5b5a44a55705"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:20:21 crc kubenswrapper[4546]: I0201 07:20:21.945688 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:21 crc kubenswrapper[4546]: I0201 07:20:21.952341 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-kube-api-access-pnrkw" (OuterVolumeSpecName: "kube-api-access-pnrkw") pod "d8a3178e-34d2-4b5d-ade9-5b5a44a55705" (UID: "d8a3178e-34d2-4b5d-ade9-5b5a44a55705"). InnerVolumeSpecName "kube-api-access-pnrkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.030261 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8a3178e-34d2-4b5d-ade9-5b5a44a55705" (UID: "d8a3178e-34d2-4b5d-ade9-5b5a44a55705"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.048602 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.048628 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnrkw\" (UniqueName: \"kubernetes.io/projected/d8a3178e-34d2-4b5d-ade9-5b5a44a55705-kube-api-access-pnrkw\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.319995 4546 generic.go:334] "Generic (PLEG): container finished" podID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerID="76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7" exitCode=0
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.320064 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzzgh" event={"ID":"d8a3178e-34d2-4b5d-ade9-5b5a44a55705","Type":"ContainerDied","Data":"76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7"}
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.320124 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vzzgh" event={"ID":"d8a3178e-34d2-4b5d-ade9-5b5a44a55705","Type":"ContainerDied","Data":"49b3e957aea4571fb78d834f50c78488f0644461850248952f99777b8cc9e57a"}
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.320150 4546 scope.go:117] "RemoveContainer" containerID="76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7"
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.320147 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vzzgh"
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.356651 4546 scope.go:117] "RemoveContainer" containerID="8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673"
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.382934 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vzzgh"]
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.383240 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vzzgh"]
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.416286 4546 scope.go:117] "RemoveContainer" containerID="fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55"
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.437746 4546 scope.go:117] "RemoveContainer" containerID="76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7"
Feb 01 07:20:22 crc kubenswrapper[4546]: E0201 07:20:22.438516 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7\": container with ID starting with 76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7 not found: ID does not exist" containerID="76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7"
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.438591 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7"} err="failed to get container status \"76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7\": rpc error: code = NotFound desc = could not find container \"76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7\": container with ID starting with 76d605c7a73d3c2e2338b28d613ddc32500b67003e90bb7d0356574520b575b7 not found: ID does not exist"
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.438807 4546 scope.go:117] "RemoveContainer" containerID="8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673"
Feb 01 07:20:22 crc kubenswrapper[4546]: E0201 07:20:22.439563 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673\": container with ID starting with 8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673 not found: ID does not exist" containerID="8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673"
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.439617 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673"} err="failed to get container status \"8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673\": rpc error: code = NotFound desc = could not find container \"8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673\": container with ID starting with 8ff68d7b153c40a9989059f5bac0106aa7c4fdd1837bc143643ca32fe0305673 not found: ID does not exist"
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.439633 4546 scope.go:117] "RemoveContainer" containerID="fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55"
Feb 01 07:20:22 crc kubenswrapper[4546]: E0201 07:20:22.440242 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55\": container with ID starting with fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55 not found: ID does not exist" containerID="fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55"
Feb 01 07:20:22 crc kubenswrapper[4546]: I0201 07:20:22.440265 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55"} err="failed to get container status \"fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55\": rpc error: code = NotFound desc = could not find container \"fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55\": container with ID starting with fd01c20eea9eeae8fe7f9e857ebddd4036f50e520be3ea707fc09c99c7dc7a55 not found: ID does not exist"
Feb 01 07:20:22 crc kubenswrapper[4546]: E0201 07:20:22.458248 4546 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8a3178e_34d2_4b5d_ade9_5b5a44a55705.slice\": RecentStats: unable to find data in memory cache]"
Feb 01 07:20:23 crc kubenswrapper[4546]: I0201 07:20:23.667709 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" path="/var/lib/kubelet/pods/d8a3178e-34d2-4b5d-ade9-5b5a44a55705/volumes"
Feb 01 07:20:29 crc kubenswrapper[4546]: I0201 07:20:29.662692 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65"
Feb 01 07:20:29 crc kubenswrapper[4546]: E0201 07:20:29.664199 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 07:20:37 crc kubenswrapper[4546]: I0201 07:20:37.472062 4546 generic.go:334] "Generic (PLEG): container finished" podID="55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" containerID="45347cc85d2ad6cd7b29d3260f1d47418f2488eadca6b45f9c3326554c7339fe" exitCode=0
Feb 01 07:20:37 crc kubenswrapper[4546]: I0201 07:20:37.472149 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" event={"ID":"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc","Type":"ContainerDied","Data":"45347cc85d2ad6cd7b29d3260f1d47418f2488eadca6b45f9c3326554c7339fe"}
Feb 01 07:20:38 crc kubenswrapper[4546]: I0201 07:20:38.883951 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz"
Feb 01 07:20:38 crc kubenswrapper[4546]: I0201 07:20:38.978902 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ssh-key-openstack-edpm-ipam\") pod \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") "
Feb 01 07:20:38 crc kubenswrapper[4546]: I0201 07:20:38.979064 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-1\") pod \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") "
Feb 01 07:20:38 crc kubenswrapper[4546]: I0201 07:20:38.979105 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7d78\" (UniqueName: \"kubernetes.io/projected/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-kube-api-access-z7d78\") pod \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") "
Feb 01 07:20:38 crc kubenswrapper[4546]: I0201 07:20:38.979197 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-telemetry-combined-ca-bundle\") pod \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") "
Feb 01 07:20:38 crc kubenswrapper[4546]: I0201 07:20:38.979279 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-2\") pod \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") "
Feb 01 07:20:38 crc kubenswrapper[4546]: I0201 07:20:38.979338 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-0\") pod \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") "
Feb 01 07:20:38 crc kubenswrapper[4546]: I0201 07:20:38.979378 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-inventory\") pod \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\" (UID: \"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc\") "
Feb 01 07:20:38 crc kubenswrapper[4546]: I0201 07:20:38.987322 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-kube-api-access-z7d78" (OuterVolumeSpecName: "kube-api-access-z7d78") pod "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" (UID: "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc"). InnerVolumeSpecName "kube-api-access-z7d78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.006007 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" (UID: "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.008797 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" (UID: "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.008875 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-inventory" (OuterVolumeSpecName: "inventory") pod "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" (UID: "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.010494 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" (UID: "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.015034 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" (UID: "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.024423 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" (UID: "55a0a6af-1383-4d40-a07a-a9ec09fc3fbc"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.083525 4546 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.083571 4546 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.083584 4546 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-inventory\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.083595 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.083606 4546 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.083617 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7d78\" (UniqueName: \"kubernetes.io/projected/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-kube-api-access-z7d78\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.083627 4546 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a0a6af-1383-4d40-a07a-a9ec09fc3fbc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.491434 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz" event={"ID":"55a0a6af-1383-4d40-a07a-a9ec09fc3fbc","Type":"ContainerDied","Data":"091129cddcf4bb54d291f847bb3359b4c6b27745f029bcd49b01ebb5b398d146"}
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.491509 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="091129cddcf4bb54d291f847bb3359b4c6b27745f029bcd49b01ebb5b398d146"
Feb 01 07:20:39 crc kubenswrapper[4546]: I0201 07:20:39.491528 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mbvnz"
Feb 01 07:20:41 crc kubenswrapper[4546]: I0201 07:20:41.655310 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65"
Feb 01 07:20:41 crc kubenswrapper[4546]: E0201 07:20:41.656116 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 07:20:53 crc kubenswrapper[4546]: I0201 07:20:53.655138 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65"
Feb 01 07:20:53 crc kubenswrapper[4546]: E0201 07:20:53.656100 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 07:21:04 crc kubenswrapper[4546]: I0201 07:21:04.655089 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65"
Feb 01 07:21:04 crc kubenswrapper[4546]: E0201 07:21:04.656441 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 07:21:19 crc kubenswrapper[4546]: I0201 07:21:19.661512 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65"
Feb 01 07:21:19 crc kubenswrapper[4546]: E0201 07:21:19.662797 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.655151 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.655274 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"]
Feb 01 07:21:30 crc kubenswrapper[4546]: E0201 07:21:30.656202 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 07:21:30 crc kubenswrapper[4546]: E0201 07:21:30.656480 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerName="extract-content"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.656501 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerName="extract-content"
Feb 01 07:21:30 crc kubenswrapper[4546]: E0201 07:21:30.656543 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerName="registry-server"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.656550 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerName="registry-server"
Feb 01 07:21:30 crc kubenswrapper[4546]: E0201 07:21:30.656564 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.656572 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 01 07:21:30 crc kubenswrapper[4546]: E0201 07:21:30.656585 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerName="extract-utilities"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.656593 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerName="extract-utilities"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.656808 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a0a6af-1383-4d40-a07a-a9ec09fc3fbc" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.656822 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a3178e-34d2-4b5d-ade9-5b5a44a55705" containerName="registry-server"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.657609 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.660377 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.668759 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7sw8w"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.669026 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.670266 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"]
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.671069 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.744346 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.744488 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.744512 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.850790 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.850845 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.850941 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.850990 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.851035 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.851070 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.851123 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjtlz\" (UniqueName: \"kubernetes.io/projected/d08de2e0-03ce-4faf-84fa-45d69c70a38b-kube-api-access-hjtlz\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.851198 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.851231 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.852054 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.852550 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-config-data\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.860426 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config-secret\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.954196 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.954609 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.954582 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.955548 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.955668 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.955772 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.955950 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjtlz\" (UniqueName: \"kubernetes.io/projected/d08de2e0-03ce-4faf-84fa-45d69c70a38b-kube-api-access-hjtlz\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.956002 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.957881 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing"
Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.959266 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ca-certs\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") "
pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.959789 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ssh-key\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.972374 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjtlz\" (UniqueName: \"kubernetes.io/projected/d08de2e0-03ce-4faf-84fa-45d69c70a38b-kube-api-access-hjtlz\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 01 07:21:30 crc kubenswrapper[4546]: I0201 07:21:30.980218 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s00-multi-thread-testing\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 01 07:21:31 crc kubenswrapper[4546]: I0201 07:21:31.281245 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 01 07:21:31 crc kubenswrapper[4546]: I0201 07:21:31.804272 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-multi-thread-testing"] Feb 01 07:21:32 crc kubenswrapper[4546]: I0201 07:21:32.084083 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"d08de2e0-03ce-4faf-84fa-45d69c70a38b","Type":"ContainerStarted","Data":"cf786f664c054b5666e270e0dccc98c16f486d6afc268888e98e95a6704fc19f"} Feb 01 07:21:42 crc kubenswrapper[4546]: I0201 07:21:42.655095 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:21:42 crc kubenswrapper[4546]: E0201 07:21:42.656136 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:21:56 crc kubenswrapper[4546]: I0201 07:21:56.655629 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:21:56 crc kubenswrapper[4546]: E0201 07:21:56.656773 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:22:07 crc kubenswrapper[4546]: I0201 
07:22:07.662737 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:22:07 crc kubenswrapper[4546]: E0201 07:22:07.663804 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:22:18 crc kubenswrapper[4546]: I0201 07:22:18.656404 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:22:18 crc kubenswrapper[4546]: E0201 07:22:18.657085 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:22:19 crc kubenswrapper[4546]: E0201 07:22:19.060549 4546 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 07:22:19 crc kubenswrapper[4546]: E0201 07:22:19.060641 4546 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 07:22:19 crc kubenswrapper[4546]: E0201 07:22:19.062749 4546 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8a0e02dd0fb8f726038072d0e3af1871,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hjtlz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest-s00-multi-thread-testing_openstack(d08de2e0-03ce-4faf-84fa-45d69c70a38b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 07:22:19 crc kubenswrapper[4546]: E0201 07:22:19.063955 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podUID="d08de2e0-03ce-4faf-84fa-45d69c70a38b" Feb 01 07:22:19 crc kubenswrapper[4546]: E0201 07:22:19.631017 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podUID="d08de2e0-03ce-4faf-84fa-45d69c70a38b" Feb 01 07:22:31 crc kubenswrapper[4546]: I0201 07:22:31.655846 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:22:31 crc kubenswrapper[4546]: E0201 07:22:31.657058 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:22:34 crc kubenswrapper[4546]: I0201 07:22:34.659224 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:22:35 crc kubenswrapper[4546]: I0201 07:22:35.379432 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 01 07:22:36 crc kubenswrapper[4546]: I0201 07:22:36.801991 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"d08de2e0-03ce-4faf-84fa-45d69c70a38b","Type":"ContainerStarted","Data":"c93b3611c4519ecf1f1de2a36e96b22725af693d46fb02cd89314b7a502a3562"} Feb 01 07:22:36 crc kubenswrapper[4546]: I0201 07:22:36.824906 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" podStartSLOduration=4.253831186 podStartE2EDuration="1m7.824875877s" podCreationTimestamp="2026-02-01 07:21:29 +0000 UTC" firstStartedPulling="2026-02-01 07:21:31.804174943 +0000 UTC m=+2322.455110959" 
lastFinishedPulling="2026-02-01 07:22:35.375219634 +0000 UTC m=+2386.026155650" observedRunningTime="2026-02-01 07:22:36.820462917 +0000 UTC m=+2387.471398932" watchObservedRunningTime="2026-02-01 07:22:36.824875877 +0000 UTC m=+2387.475811893" Feb 01 07:22:43 crc kubenswrapper[4546]: I0201 07:22:43.655606 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:22:43 crc kubenswrapper[4546]: E0201 07:22:43.656914 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:22:57 crc kubenswrapper[4546]: I0201 07:22:57.655560 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:22:57 crc kubenswrapper[4546]: E0201 07:22:57.656312 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:23:10 crc kubenswrapper[4546]: I0201 07:23:10.656154 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:23:10 crc kubenswrapper[4546]: E0201 07:23:10.657486 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:23:24 crc kubenswrapper[4546]: I0201 07:23:24.654754 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:23:24 crc kubenswrapper[4546]: E0201 07:23:24.655678 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:23:35 crc kubenswrapper[4546]: I0201 07:23:35.655706 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:23:35 crc kubenswrapper[4546]: E0201 07:23:35.656872 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:23:47 crc kubenswrapper[4546]: I0201 07:23:47.654658 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:23:47 crc kubenswrapper[4546]: E0201 07:23:47.655586 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:24:02 crc kubenswrapper[4546]: I0201 07:24:02.654939 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:24:02 crc kubenswrapper[4546]: E0201 07:24:02.655836 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:24:15 crc kubenswrapper[4546]: I0201 07:24:15.659505 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:24:15 crc kubenswrapper[4546]: E0201 07:24:15.664416 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:24:20 crc kubenswrapper[4546]: E0201 07:24:20.760572 4546 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.196:38208->192.168.26.196:40843: write tcp 192.168.26.196:38208->192.168.26.196:40843: write: broken pipe Feb 01 07:24:30 crc kubenswrapper[4546]: I0201 07:24:30.655460 4546 scope.go:117] "RemoveContainer" 
containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:24:30 crc kubenswrapper[4546]: I0201 07:24:30.975370 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"ddbfbd2ce5dde2db2044ebf043da0d70bc816e954342544429f141f0b757a606"} Feb 01 07:26:55 crc kubenswrapper[4546]: I0201 07:26:55.420713 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:26:55 crc kubenswrapper[4546]: I0201 07:26:55.421451 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.712841 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xndrh"] Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.722433 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.736647 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-catalog-content\") pod \"certified-operators-xndrh\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.737561 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dzl4\" (UniqueName: \"kubernetes.io/projected/66d15b4f-8f7f-4008-86e5-d0b74d308f29-kube-api-access-7dzl4\") pod \"certified-operators-xndrh\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.738036 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-utilities\") pod \"certified-operators-xndrh\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.763367 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xndrh"] Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.841962 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-utilities\") pod \"certified-operators-xndrh\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.842158 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-catalog-content\") pod \"certified-operators-xndrh\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.842233 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dzl4\" (UniqueName: \"kubernetes.io/projected/66d15b4f-8f7f-4008-86e5-d0b74d308f29-kube-api-access-7dzl4\") pod \"certified-operators-xndrh\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.845325 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-utilities\") pod \"certified-operators-xndrh\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.846018 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-catalog-content\") pod \"certified-operators-xndrh\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:20 crc kubenswrapper[4546]: I0201 07:27:20.872705 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dzl4\" (UniqueName: \"kubernetes.io/projected/66d15b4f-8f7f-4008-86e5-d0b74d308f29-kube-api-access-7dzl4\") pod \"certified-operators-xndrh\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:21 crc kubenswrapper[4546]: I0201 07:27:21.046150 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:22 crc kubenswrapper[4546]: I0201 07:27:22.000029 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xndrh"] Feb 01 07:27:22 crc kubenswrapper[4546]: I0201 07:27:22.637021 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xndrh" event={"ID":"66d15b4f-8f7f-4008-86e5-d0b74d308f29","Type":"ContainerDied","Data":"de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f"} Feb 01 07:27:22 crc kubenswrapper[4546]: I0201 07:27:22.637266 4546 generic.go:334] "Generic (PLEG): container finished" podID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerID="de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f" exitCode=0 Feb 01 07:27:22 crc kubenswrapper[4546]: I0201 07:27:22.637592 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xndrh" event={"ID":"66d15b4f-8f7f-4008-86e5-d0b74d308f29","Type":"ContainerStarted","Data":"40ec28f9f296ff3b133f7f31c77acd9206f9a6616a9044dcd2bf1b1f7e787a68"} Feb 01 07:27:24 crc kubenswrapper[4546]: I0201 07:27:24.658456 4546 generic.go:334] "Generic (PLEG): container finished" podID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerID="332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019" exitCode=0 Feb 01 07:27:24 crc kubenswrapper[4546]: I0201 07:27:24.658582 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xndrh" event={"ID":"66d15b4f-8f7f-4008-86e5-d0b74d308f29","Type":"ContainerDied","Data":"332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019"} Feb 01 07:27:25 crc kubenswrapper[4546]: I0201 07:27:25.420730 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:27:25 crc kubenswrapper[4546]: I0201 07:27:25.421247 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:27:25 crc kubenswrapper[4546]: I0201 07:27:25.672744 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xndrh" event={"ID":"66d15b4f-8f7f-4008-86e5-d0b74d308f29","Type":"ContainerStarted","Data":"90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec"} Feb 01 07:27:25 crc kubenswrapper[4546]: I0201 07:27:25.705190 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xndrh" podStartSLOduration=3.216160772 podStartE2EDuration="5.701420771s" podCreationTimestamp="2026-02-01 07:27:20 +0000 UTC" firstStartedPulling="2026-02-01 07:27:22.639047817 +0000 UTC m=+2673.289983833" lastFinishedPulling="2026-02-01 07:27:25.124307815 +0000 UTC m=+2675.775243832" observedRunningTime="2026-02-01 07:27:25.691792897 +0000 UTC m=+2676.342728914" watchObservedRunningTime="2026-02-01 07:27:25.701420771 +0000 UTC m=+2676.352356788" Feb 01 07:27:31 crc kubenswrapper[4546]: I0201 07:27:31.052895 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:31 crc kubenswrapper[4546]: I0201 07:27:31.054281 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:32 crc kubenswrapper[4546]: I0201 07:27:32.130637 4546 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-xndrh" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerName="registry-server" probeResult="failure" output=< Feb 01 07:27:32 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:27:32 crc kubenswrapper[4546]: > Feb 01 07:27:41 crc kubenswrapper[4546]: I0201 07:27:41.101771 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:41 crc kubenswrapper[4546]: I0201 07:27:41.146128 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:41 crc kubenswrapper[4546]: I0201 07:27:41.342937 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xndrh"] Feb 01 07:27:42 crc kubenswrapper[4546]: I0201 07:27:42.844307 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xndrh" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerName="registry-server" containerID="cri-o://90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec" gracePeriod=2 Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.635628 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.732406 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-catalog-content\") pod \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.732533 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-utilities\") pod \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.732696 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dzl4\" (UniqueName: \"kubernetes.io/projected/66d15b4f-8f7f-4008-86e5-d0b74d308f29-kube-api-access-7dzl4\") pod \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\" (UID: \"66d15b4f-8f7f-4008-86e5-d0b74d308f29\") " Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.735127 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-utilities" (OuterVolumeSpecName: "utilities") pod "66d15b4f-8f7f-4008-86e5-d0b74d308f29" (UID: "66d15b4f-8f7f-4008-86e5-d0b74d308f29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.744357 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d15b4f-8f7f-4008-86e5-d0b74d308f29-kube-api-access-7dzl4" (OuterVolumeSpecName: "kube-api-access-7dzl4") pod "66d15b4f-8f7f-4008-86e5-d0b74d308f29" (UID: "66d15b4f-8f7f-4008-86e5-d0b74d308f29"). InnerVolumeSpecName "kube-api-access-7dzl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.816212 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66d15b4f-8f7f-4008-86e5-d0b74d308f29" (UID: "66d15b4f-8f7f-4008-86e5-d0b74d308f29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.834694 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.834731 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d15b4f-8f7f-4008-86e5-d0b74d308f29-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.834745 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dzl4\" (UniqueName: \"kubernetes.io/projected/66d15b4f-8f7f-4008-86e5-d0b74d308f29-kube-api-access-7dzl4\") on node \"crc\" DevicePath \"\"" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.857235 4546 generic.go:334] "Generic (PLEG): container finished" podID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerID="90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec" exitCode=0 Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.857298 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xndrh" event={"ID":"66d15b4f-8f7f-4008-86e5-d0b74d308f29","Type":"ContainerDied","Data":"90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec"} Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.857339 4546 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-xndrh" event={"ID":"66d15b4f-8f7f-4008-86e5-d0b74d308f29","Type":"ContainerDied","Data":"40ec28f9f296ff3b133f7f31c77acd9206f9a6616a9044dcd2bf1b1f7e787a68"} Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.857578 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xndrh" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.860881 4546 scope.go:117] "RemoveContainer" containerID="90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.901871 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xndrh"] Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.906173 4546 scope.go:117] "RemoveContainer" containerID="332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.923943 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xndrh"] Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.936558 4546 scope.go:117] "RemoveContainer" containerID="de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.975934 4546 scope.go:117] "RemoveContainer" containerID="90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec" Feb 01 07:27:43 crc kubenswrapper[4546]: E0201 07:27:43.978041 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec\": container with ID starting with 90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec not found: ID does not exist" containerID="90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 
07:27:43.978655 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec"} err="failed to get container status \"90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec\": rpc error: code = NotFound desc = could not find container \"90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec\": container with ID starting with 90735085e3d9ac2ada123c694e43863c9146b47b113d33485509421441ab92ec not found: ID does not exist" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.978708 4546 scope.go:117] "RemoveContainer" containerID="332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019" Feb 01 07:27:43 crc kubenswrapper[4546]: E0201 07:27:43.979171 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019\": container with ID starting with 332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019 not found: ID does not exist" containerID="332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.979221 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019"} err="failed to get container status \"332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019\": rpc error: code = NotFound desc = could not find container \"332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019\": container with ID starting with 332077bbfc2b2e5b57f62a70b4fd192cf23b52e4ab061ad0da5a0bd20aedb019 not found: ID does not exist" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.979254 4546 scope.go:117] "RemoveContainer" containerID="de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f" Feb 01 07:27:43 crc 
kubenswrapper[4546]: E0201 07:27:43.979575 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f\": container with ID starting with de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f not found: ID does not exist" containerID="de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f" Feb 01 07:27:43 crc kubenswrapper[4546]: I0201 07:27:43.979670 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f"} err="failed to get container status \"de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f\": rpc error: code = NotFound desc = could not find container \"de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f\": container with ID starting with de2d9ee2bf030b582b687a628622fa1379806ce0f4fd26882f156bb2bacbb33f not found: ID does not exist" Feb 01 07:27:45 crc kubenswrapper[4546]: I0201 07:27:45.664306 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" path="/var/lib/kubelet/pods/66d15b4f-8f7f-4008-86e5-d0b74d308f29/volumes" Feb 01 07:27:55 crc kubenswrapper[4546]: I0201 07:27:55.420492 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:27:55 crc kubenswrapper[4546]: I0201 07:27:55.421229 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 01 07:27:55 crc kubenswrapper[4546]: I0201 07:27:55.421287 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:27:55 crc kubenswrapper[4546]: I0201 07:27:55.422580 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ddbfbd2ce5dde2db2044ebf043da0d70bc816e954342544429f141f0b757a606"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:27:55 crc kubenswrapper[4546]: I0201 07:27:55.422645 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://ddbfbd2ce5dde2db2044ebf043da0d70bc816e954342544429f141f0b757a606" gracePeriod=600 Feb 01 07:27:55 crc kubenswrapper[4546]: I0201 07:27:55.981829 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="ddbfbd2ce5dde2db2044ebf043da0d70bc816e954342544429f141f0b757a606" exitCode=0 Feb 01 07:27:55 crc kubenswrapper[4546]: I0201 07:27:55.981906 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"ddbfbd2ce5dde2db2044ebf043da0d70bc816e954342544429f141f0b757a606"} Feb 01 07:27:55 crc kubenswrapper[4546]: I0201 07:27:55.982331 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008"} Feb 01 07:27:55 crc kubenswrapper[4546]: I0201 07:27:55.982372 4546 scope.go:117] "RemoveContainer" containerID="f9485c792b4df68be6fb5a032ee9232a19dd205e7a86cb75f06fc67fea4d8d65" Feb 01 07:28:13 crc kubenswrapper[4546]: E0201 07:28:13.335523 4546 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.196:57450->192.168.26.196:40843: write tcp 192.168.26.196:57450->192.168.26.196:40843: write: broken pipe Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.267642 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vsh2w"] Feb 01 07:29:06 crc kubenswrapper[4546]: E0201 07:29:06.273736 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerName="registry-server" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.273775 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerName="registry-server" Feb 01 07:29:06 crc kubenswrapper[4546]: E0201 07:29:06.273820 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerName="extract-content" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.273828 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerName="extract-content" Feb 01 07:29:06 crc kubenswrapper[4546]: E0201 07:29:06.273846 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerName="extract-utilities" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.273865 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerName="extract-utilities" Feb 01 07:29:06 crc kubenswrapper[4546]: 
I0201 07:29:06.274901 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d15b4f-8f7f-4008-86e5-d0b74d308f29" containerName="registry-server" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.279385 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.364133 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-utilities\") pod \"redhat-operators-vsh2w\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.364412 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-catalog-content\") pod \"redhat-operators-vsh2w\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.364593 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nt7x\" (UniqueName: \"kubernetes.io/projected/a323ba64-bd5d-42fb-a29f-521dd01c8895-kube-api-access-5nt7x\") pod \"redhat-operators-vsh2w\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.377979 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vsh2w"] Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.467352 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-catalog-content\") pod \"redhat-operators-vsh2w\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.467464 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nt7x\" (UniqueName: \"kubernetes.io/projected/a323ba64-bd5d-42fb-a29f-521dd01c8895-kube-api-access-5nt7x\") pod \"redhat-operators-vsh2w\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.467774 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-utilities\") pod \"redhat-operators-vsh2w\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.469480 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-utilities\") pod \"redhat-operators-vsh2w\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.469528 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-catalog-content\") pod \"redhat-operators-vsh2w\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.511516 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nt7x\" (UniqueName: 
\"kubernetes.io/projected/a323ba64-bd5d-42fb-a29f-521dd01c8895-kube-api-access-5nt7x\") pod \"redhat-operators-vsh2w\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:06 crc kubenswrapper[4546]: I0201 07:29:06.605799 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:07 crc kubenswrapper[4546]: I0201 07:29:07.475395 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vsh2w"] Feb 01 07:29:07 crc kubenswrapper[4546]: I0201 07:29:07.758325 4546 generic.go:334] "Generic (PLEG): container finished" podID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerID="d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152" exitCode=0 Feb 01 07:29:07 crc kubenswrapper[4546]: I0201 07:29:07.758442 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsh2w" event={"ID":"a323ba64-bd5d-42fb-a29f-521dd01c8895","Type":"ContainerDied","Data":"d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152"} Feb 01 07:29:07 crc kubenswrapper[4546]: I0201 07:29:07.758690 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsh2w" event={"ID":"a323ba64-bd5d-42fb-a29f-521dd01c8895","Type":"ContainerStarted","Data":"bc7fcd4989922d707ebe582cf65302424212a5afdf78189906e0185361588ab2"} Feb 01 07:29:07 crc kubenswrapper[4546]: I0201 07:29:07.763953 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:29:08 crc kubenswrapper[4546]: I0201 07:29:08.768108 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsh2w" event={"ID":"a323ba64-bd5d-42fb-a29f-521dd01c8895","Type":"ContainerStarted","Data":"d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9"} Feb 01 07:29:11 crc 
kubenswrapper[4546]: I0201 07:29:11.828686 4546 generic.go:334] "Generic (PLEG): container finished" podID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerID="d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9" exitCode=0 Feb 01 07:29:11 crc kubenswrapper[4546]: I0201 07:29:11.828774 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsh2w" event={"ID":"a323ba64-bd5d-42fb-a29f-521dd01c8895","Type":"ContainerDied","Data":"d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9"} Feb 01 07:29:12 crc kubenswrapper[4546]: I0201 07:29:12.845148 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsh2w" event={"ID":"a323ba64-bd5d-42fb-a29f-521dd01c8895","Type":"ContainerStarted","Data":"8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d"} Feb 01 07:29:12 crc kubenswrapper[4546]: I0201 07:29:12.866391 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vsh2w" podStartSLOduration=2.317663016 podStartE2EDuration="6.864882693s" podCreationTimestamp="2026-02-01 07:29:06 +0000 UTC" firstStartedPulling="2026-02-01 07:29:07.760324517 +0000 UTC m=+2778.411260532" lastFinishedPulling="2026-02-01 07:29:12.307544193 +0000 UTC m=+2782.958480209" observedRunningTime="2026-02-01 07:29:12.863942982 +0000 UTC m=+2783.514879018" watchObservedRunningTime="2026-02-01 07:29:12.864882693 +0000 UTC m=+2783.515818710" Feb 01 07:29:16 crc kubenswrapper[4546]: I0201 07:29:16.606671 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:16 crc kubenswrapper[4546]: I0201 07:29:16.607231 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:17 crc kubenswrapper[4546]: I0201 07:29:17.648486 4546 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-vsh2w" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="registry-server" probeResult="failure" output=< Feb 01 07:29:17 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:29:17 crc kubenswrapper[4546]: > Feb 01 07:29:27 crc kubenswrapper[4546]: I0201 07:29:27.654991 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vsh2w" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="registry-server" probeResult="failure" output=< Feb 01 07:29:27 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:29:27 crc kubenswrapper[4546]: > Feb 01 07:29:36 crc kubenswrapper[4546]: I0201 07:29:36.682310 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:36 crc kubenswrapper[4546]: I0201 07:29:36.733675 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:37 crc kubenswrapper[4546]: I0201 07:29:37.444326 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vsh2w"] Feb 01 07:29:38 crc kubenswrapper[4546]: I0201 07:29:38.124128 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vsh2w" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="registry-server" containerID="cri-o://8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d" gracePeriod=2 Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.085303 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.135955 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vsh2w" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.136019 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsh2w" event={"ID":"a323ba64-bd5d-42fb-a29f-521dd01c8895","Type":"ContainerDied","Data":"8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d"} Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.136543 4546 generic.go:334] "Generic (PLEG): container finished" podID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerID="8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d" exitCode=0 Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.136602 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsh2w" event={"ID":"a323ba64-bd5d-42fb-a29f-521dd01c8895","Type":"ContainerDied","Data":"bc7fcd4989922d707ebe582cf65302424212a5afdf78189906e0185361588ab2"} Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.136903 4546 scope.go:117] "RemoveContainer" containerID="8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.174069 4546 scope.go:117] "RemoveContainer" containerID="d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.201642 4546 scope.go:117] "RemoveContainer" containerID="d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.218358 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-utilities\") pod \"a323ba64-bd5d-42fb-a29f-521dd01c8895\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.218508 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5nt7x\" (UniqueName: \"kubernetes.io/projected/a323ba64-bd5d-42fb-a29f-521dd01c8895-kube-api-access-5nt7x\") pod \"a323ba64-bd5d-42fb-a29f-521dd01c8895\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.218599 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-catalog-content\") pod \"a323ba64-bd5d-42fb-a29f-521dd01c8895\" (UID: \"a323ba64-bd5d-42fb-a29f-521dd01c8895\") " Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.219674 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-utilities" (OuterVolumeSpecName: "utilities") pod "a323ba64-bd5d-42fb-a29f-521dd01c8895" (UID: "a323ba64-bd5d-42fb-a29f-521dd01c8895"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.239085 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a323ba64-bd5d-42fb-a29f-521dd01c8895-kube-api-access-5nt7x" (OuterVolumeSpecName: "kube-api-access-5nt7x") pod "a323ba64-bd5d-42fb-a29f-521dd01c8895" (UID: "a323ba64-bd5d-42fb-a29f-521dd01c8895"). InnerVolumeSpecName "kube-api-access-5nt7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.240282 4546 scope.go:117] "RemoveContainer" containerID="8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d" Feb 01 07:29:39 crc kubenswrapper[4546]: E0201 07:29:39.244514 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d\": container with ID starting with 8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d not found: ID does not exist" containerID="8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.245158 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d"} err="failed to get container status \"8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d\": rpc error: code = NotFound desc = could not find container \"8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d\": container with ID starting with 8a91c64e12c36df271ad23f3aa4009886a69edb70d8563c778a7456f8650c56d not found: ID does not exist" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.245193 4546 scope.go:117] "RemoveContainer" containerID="d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9" Feb 01 07:29:39 crc kubenswrapper[4546]: E0201 07:29:39.245556 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9\": container with ID starting with d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9 not found: ID does not exist" containerID="d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.245582 
4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9"} err="failed to get container status \"d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9\": rpc error: code = NotFound desc = could not find container \"d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9\": container with ID starting with d5567c5d13ab94458c4ba6d444660e24480e86da06c48dcd96dea1585c5e78b9 not found: ID does not exist" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.245595 4546 scope.go:117] "RemoveContainer" containerID="d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152" Feb 01 07:29:39 crc kubenswrapper[4546]: E0201 07:29:39.245959 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152\": container with ID starting with d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152 not found: ID does not exist" containerID="d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.245980 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152"} err="failed to get container status \"d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152\": rpc error: code = NotFound desc = could not find container \"d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152\": container with ID starting with d9337f38144e6f09c12a48ee0912685dc220e3fb7db764878684007c3fd71152 not found: ID does not exist" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.323300 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-utilities\") on node 
\"crc\" DevicePath \"\"" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.323339 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nt7x\" (UniqueName: \"kubernetes.io/projected/a323ba64-bd5d-42fb-a29f-521dd01c8895-kube-api-access-5nt7x\") on node \"crc\" DevicePath \"\"" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.363703 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a323ba64-bd5d-42fb-a29f-521dd01c8895" (UID: "a323ba64-bd5d-42fb-a29f-521dd01c8895"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.427466 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a323ba64-bd5d-42fb-a29f-521dd01c8895-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.469742 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vsh2w"] Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.475747 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vsh2w"] Feb 01 07:29:39 crc kubenswrapper[4546]: I0201 07:29:39.707267 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" path="/var/lib/kubelet/pods/a323ba64-bd5d-42fb-a29f-521dd01c8895/volumes" Feb 01 07:29:55 crc kubenswrapper[4546]: I0201 07:29:55.423273 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:29:55 crc 
kubenswrapper[4546]: I0201 07:29:55.425149 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.763926 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc"] Feb 01 07:30:00 crc kubenswrapper[4546]: E0201 07:30:00.766621 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="extract-utilities" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.766650 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="extract-utilities" Feb 01 07:30:00 crc kubenswrapper[4546]: E0201 07:30:00.767101 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="registry-server" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.767120 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="registry-server" Feb 01 07:30:00 crc kubenswrapper[4546]: E0201 07:30:00.767163 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="extract-content" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.767169 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="extract-content" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.767904 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a323ba64-bd5d-42fb-a29f-521dd01c8895" containerName="registry-server" Feb 01 07:30:00 crc 
kubenswrapper[4546]: I0201 07:30:00.772239 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.810347 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.811421 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.877796 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d72617a1-043e-418e-906c-41c594b4708c-config-volume\") pod \"collect-profiles-29498850-lfmzc\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.877899 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d72617a1-043e-418e-906c-41c594b4708c-secret-volume\") pod \"collect-profiles-29498850-lfmzc\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.878031 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp2kh\" (UniqueName: \"kubernetes.io/projected/d72617a1-043e-418e-906c-41c594b4708c-kube-api-access-zp2kh\") pod \"collect-profiles-29498850-lfmzc\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.928612 4546 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc"] Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.980703 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d72617a1-043e-418e-906c-41c594b4708c-config-volume\") pod \"collect-profiles-29498850-lfmzc\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.980814 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d72617a1-043e-418e-906c-41c594b4708c-secret-volume\") pod \"collect-profiles-29498850-lfmzc\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.981101 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp2kh\" (UniqueName: \"kubernetes.io/projected/d72617a1-043e-418e-906c-41c594b4708c-kube-api-access-zp2kh\") pod \"collect-profiles-29498850-lfmzc\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:00 crc kubenswrapper[4546]: I0201 07:30:00.985266 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d72617a1-043e-418e-906c-41c594b4708c-config-volume\") pod \"collect-profiles-29498850-lfmzc\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:01 crc kubenswrapper[4546]: I0201 07:30:01.007731 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d72617a1-043e-418e-906c-41c594b4708c-secret-volume\") pod \"collect-profiles-29498850-lfmzc\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:01 crc kubenswrapper[4546]: I0201 07:30:01.008280 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp2kh\" (UniqueName: \"kubernetes.io/projected/d72617a1-043e-418e-906c-41c594b4708c-kube-api-access-zp2kh\") pod \"collect-profiles-29498850-lfmzc\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:01 crc kubenswrapper[4546]: I0201 07:30:01.111747 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:01 crc kubenswrapper[4546]: I0201 07:30:01.897869 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc"] Feb 01 07:30:02 crc kubenswrapper[4546]: I0201 07:30:02.370213 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" event={"ID":"d72617a1-043e-418e-906c-41c594b4708c","Type":"ContainerStarted","Data":"836ab4957a933756f4fd2270dd11e1ecb73e65abf0c6bf865774dc134c2f86f6"} Feb 01 07:30:02 crc kubenswrapper[4546]: I0201 07:30:02.370487 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" event={"ID":"d72617a1-043e-418e-906c-41c594b4708c","Type":"ContainerStarted","Data":"dd0a22e2b9c1277b6a8bb413bc1d16e71c20f88095f75787ecc6ef11eaf6d697"} Feb 01 07:30:02 crc kubenswrapper[4546]: I0201 07:30:02.383499 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" 
podStartSLOduration=2.383016553 podStartE2EDuration="2.383016553s" podCreationTimestamp="2026-02-01 07:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:30:02.382324058 +0000 UTC m=+2833.033260073" watchObservedRunningTime="2026-02-01 07:30:02.383016553 +0000 UTC m=+2833.033952569" Feb 01 07:30:03 crc kubenswrapper[4546]: I0201 07:30:03.382426 4546 generic.go:334] "Generic (PLEG): container finished" podID="d72617a1-043e-418e-906c-41c594b4708c" containerID="836ab4957a933756f4fd2270dd11e1ecb73e65abf0c6bf865774dc134c2f86f6" exitCode=0 Feb 01 07:30:03 crc kubenswrapper[4546]: I0201 07:30:03.382502 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" event={"ID":"d72617a1-043e-418e-906c-41c594b4708c","Type":"ContainerDied","Data":"836ab4957a933756f4fd2270dd11e1ecb73e65abf0c6bf865774dc134c2f86f6"} Feb 01 07:30:04 crc kubenswrapper[4546]: I0201 07:30:04.838960 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:04 crc kubenswrapper[4546]: I0201 07:30:04.970067 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp2kh\" (UniqueName: \"kubernetes.io/projected/d72617a1-043e-418e-906c-41c594b4708c-kube-api-access-zp2kh\") pod \"d72617a1-043e-418e-906c-41c594b4708c\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " Feb 01 07:30:04 crc kubenswrapper[4546]: I0201 07:30:04.970598 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d72617a1-043e-418e-906c-41c594b4708c-secret-volume\") pod \"d72617a1-043e-418e-906c-41c594b4708c\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " Feb 01 07:30:04 crc kubenswrapper[4546]: I0201 07:30:04.970884 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d72617a1-043e-418e-906c-41c594b4708c-config-volume\") pod \"d72617a1-043e-418e-906c-41c594b4708c\" (UID: \"d72617a1-043e-418e-906c-41c594b4708c\") " Feb 01 07:30:04 crc kubenswrapper[4546]: I0201 07:30:04.972021 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"] Feb 01 07:30:04 crc kubenswrapper[4546]: I0201 07:30:04.973123 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d72617a1-043e-418e-906c-41c594b4708c-config-volume" (OuterVolumeSpecName: "config-volume") pod "d72617a1-043e-418e-906c-41c594b4708c" (UID: "d72617a1-043e-418e-906c-41c594b4708c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:30:04 crc kubenswrapper[4546]: I0201 07:30:04.980760 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d72617a1-043e-418e-906c-41c594b4708c-kube-api-access-zp2kh" (OuterVolumeSpecName: "kube-api-access-zp2kh") pod "d72617a1-043e-418e-906c-41c594b4708c" (UID: "d72617a1-043e-418e-906c-41c594b4708c"). InnerVolumeSpecName "kube-api-access-zp2kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:30:04 crc kubenswrapper[4546]: I0201 07:30:04.988441 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-t8nbr"] Feb 01 07:30:04 crc kubenswrapper[4546]: I0201 07:30:04.998488 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72617a1-043e-418e-906c-41c594b4708c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d72617a1-043e-418e-906c-41c594b4708c" (UID: "d72617a1-043e-418e-906c-41c594b4708c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:30:05 crc kubenswrapper[4546]: I0201 07:30:05.074838 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp2kh\" (UniqueName: \"kubernetes.io/projected/d72617a1-043e-418e-906c-41c594b4708c-kube-api-access-zp2kh\") on node \"crc\" DevicePath \"\"" Feb 01 07:30:05 crc kubenswrapper[4546]: I0201 07:30:05.074901 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d72617a1-043e-418e-906c-41c594b4708c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:30:05 crc kubenswrapper[4546]: I0201 07:30:05.074914 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d72617a1-043e-418e-906c-41c594b4708c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:30:05 crc kubenswrapper[4546]: I0201 07:30:05.400748 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" event={"ID":"d72617a1-043e-418e-906c-41c594b4708c","Type":"ContainerDied","Data":"dd0a22e2b9c1277b6a8bb413bc1d16e71c20f88095f75787ecc6ef11eaf6d697"} Feb 01 07:30:05 crc kubenswrapper[4546]: I0201 07:30:05.400808 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd0a22e2b9c1277b6a8bb413bc1d16e71c20f88095f75787ecc6ef11eaf6d697" Feb 01 07:30:05 crc kubenswrapper[4546]: I0201 07:30:05.400919 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc" Feb 01 07:30:05 crc kubenswrapper[4546]: I0201 07:30:05.667049 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd9b242-1e4e-46f9-b8fb-04175b46cf9a" path="/var/lib/kubelet/pods/dfd9b242-1e4e-46f9-b8fb-04175b46cf9a/volumes" Feb 01 07:30:25 crc kubenswrapper[4546]: I0201 07:30:25.421882 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:30:25 crc kubenswrapper[4546]: I0201 07:30:25.423344 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:30:55 crc kubenswrapper[4546]: I0201 07:30:55.420745 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:30:55 crc kubenswrapper[4546]: I0201 07:30:55.421451 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:30:55 crc kubenswrapper[4546]: I0201 07:30:55.422413 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:30:55 crc kubenswrapper[4546]: I0201 07:30:55.424490 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:30:55 crc kubenswrapper[4546]: I0201 07:30:55.425001 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" gracePeriod=600 Feb 01 07:30:55 crc kubenswrapper[4546]: E0201 07:30:55.561746 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:30:55 crc kubenswrapper[4546]: I0201 07:30:55.894594 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" exitCode=0 Feb 01 07:30:55 crc kubenswrapper[4546]: I0201 07:30:55.895400 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008"} Feb 01 07:30:55 crc 
kubenswrapper[4546]: I0201 07:30:55.897135 4546 scope.go:117] "RemoveContainer" containerID="ddbfbd2ce5dde2db2044ebf043da0d70bc816e954342544429f141f0b757a606" Feb 01 07:30:55 crc kubenswrapper[4546]: I0201 07:30:55.897238 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:30:55 crc kubenswrapper[4546]: E0201 07:30:55.897616 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:30:57 crc kubenswrapper[4546]: I0201 07:30:57.462909 4546 scope.go:117] "RemoveContainer" containerID="933e09563ae018cedf41d60679f6ecf138c654b345a755f567f11bc247d7d4ba" Feb 01 07:31:10 crc kubenswrapper[4546]: I0201 07:31:10.656166 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:31:10 crc kubenswrapper[4546]: E0201 07:31:10.657622 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:31:21 crc kubenswrapper[4546]: I0201 07:31:21.655537 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:31:21 crc kubenswrapper[4546]: E0201 07:31:21.657029 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:31:35 crc kubenswrapper[4546]: I0201 07:31:35.656014 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:31:35 crc kubenswrapper[4546]: E0201 07:31:35.656953 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:31:46 crc kubenswrapper[4546]: I0201 07:31:46.655778 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:31:46 crc kubenswrapper[4546]: E0201 07:31:46.656615 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:31:59 crc kubenswrapper[4546]: I0201 07:31:59.666527 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:31:59 crc kubenswrapper[4546]: E0201 07:31:59.668076 4546 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.063188 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x29ns"] Feb 01 07:32:11 crc kubenswrapper[4546]: E0201 07:32:11.067585 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72617a1-043e-418e-906c-41c594b4708c" containerName="collect-profiles" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.067849 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72617a1-043e-418e-906c-41c594b4708c" containerName="collect-profiles" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.069191 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72617a1-043e-418e-906c-41c594b4708c" containerName="collect-profiles" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.073465 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.185933 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-catalog-content\") pod \"community-operators-x29ns\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.186373 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-utilities\") pod \"community-operators-x29ns\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.186624 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8vpq\" (UniqueName: \"kubernetes.io/projected/164d7a9c-7559-4fdb-a626-5ed784a738ca-kube-api-access-m8vpq\") pod \"community-operators-x29ns\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.282535 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x29ns"] Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.289050 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-utilities\") pod \"community-operators-x29ns\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.289111 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m8vpq\" (UniqueName: \"kubernetes.io/projected/164d7a9c-7559-4fdb-a626-5ed784a738ca-kube-api-access-m8vpq\") pod \"community-operators-x29ns\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.289182 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-catalog-content\") pod \"community-operators-x29ns\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.297104 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-catalog-content\") pod \"community-operators-x29ns\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.297517 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-utilities\") pod \"community-operators-x29ns\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.331655 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8vpq\" (UniqueName: \"kubernetes.io/projected/164d7a9c-7559-4fdb-a626-5ed784a738ca-kube-api-access-m8vpq\") pod \"community-operators-x29ns\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:11 crc kubenswrapper[4546]: I0201 07:32:11.399973 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:12 crc kubenswrapper[4546]: I0201 07:32:12.398646 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x29ns"] Feb 01 07:32:12 crc kubenswrapper[4546]: I0201 07:32:12.622597 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x29ns" event={"ID":"164d7a9c-7559-4fdb-a626-5ed784a738ca","Type":"ContainerStarted","Data":"ced74c15e149e15f3c8534efa587f335fc0bd41335213232c6ee6c94af638602"} Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.372954 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-44vn7"] Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.380192 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.402279 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44vn7"] Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.471072 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-utilities\") pod \"redhat-marketplace-44vn7\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.471123 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-catalog-content\") pod \"redhat-marketplace-44vn7\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.471897 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6jfp\" (UniqueName: \"kubernetes.io/projected/e29aeca6-3ee0-4533-9501-61597acc573a-kube-api-access-f6jfp\") pod \"redhat-marketplace-44vn7\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.573471 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-utilities\") pod \"redhat-marketplace-44vn7\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.573509 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-catalog-content\") pod \"redhat-marketplace-44vn7\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.573619 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6jfp\" (UniqueName: \"kubernetes.io/projected/e29aeca6-3ee0-4533-9501-61597acc573a-kube-api-access-f6jfp\") pod \"redhat-marketplace-44vn7\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.576449 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-utilities\") pod \"redhat-marketplace-44vn7\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.577275 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-catalog-content\") pod \"redhat-marketplace-44vn7\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.593915 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6jfp\" (UniqueName: \"kubernetes.io/projected/e29aeca6-3ee0-4533-9501-61597acc573a-kube-api-access-f6jfp\") pod \"redhat-marketplace-44vn7\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.632138 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x29ns" event={"ID":"164d7a9c-7559-4fdb-a626-5ed784a738ca","Type":"ContainerDied","Data":"102be77cf24ba0c4aea728bc278d01371251fb365afd787f728b6cad6db2dc5b"} Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.632469 4546 generic.go:334] "Generic (PLEG): container finished" podID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerID="102be77cf24ba0c4aea728bc278d01371251fb365afd787f728b6cad6db2dc5b" exitCode=0 Feb 01 07:32:13 crc kubenswrapper[4546]: I0201 07:32:13.696823 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:14 crc kubenswrapper[4546]: I0201 07:32:14.208177 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44vn7"] Feb 01 07:32:14 crc kubenswrapper[4546]: I0201 07:32:14.643044 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x29ns" event={"ID":"164d7a9c-7559-4fdb-a626-5ed784a738ca","Type":"ContainerStarted","Data":"53d24afaeda034553864e84272778e3a253ebde84f2d11e44c01eae7bcd81031"} Feb 01 07:32:14 crc kubenswrapper[4546]: I0201 07:32:14.645111 4546 generic.go:334] "Generic (PLEG): container finished" podID="e29aeca6-3ee0-4533-9501-61597acc573a" containerID="c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995" exitCode=0 Feb 01 07:32:14 crc kubenswrapper[4546]: I0201 07:32:14.645162 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44vn7" event={"ID":"e29aeca6-3ee0-4533-9501-61597acc573a","Type":"ContainerDied","Data":"c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995"} Feb 01 07:32:14 crc kubenswrapper[4546]: I0201 07:32:14.645191 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44vn7" event={"ID":"e29aeca6-3ee0-4533-9501-61597acc573a","Type":"ContainerStarted","Data":"7b4907c13dcdc6d03476700d9c90ace3c21434262542f6df86d7d32b6c412bf2"} Feb 01 07:32:14 crc kubenswrapper[4546]: I0201 07:32:14.657063 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:32:14 crc kubenswrapper[4546]: E0201 07:32:14.657407 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:32:15 crc kubenswrapper[4546]: I0201 07:32:15.668928 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44vn7" event={"ID":"e29aeca6-3ee0-4533-9501-61597acc573a","Type":"ContainerStarted","Data":"f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb"} Feb 01 07:32:16 crc kubenswrapper[4546]: I0201 07:32:16.690181 4546 generic.go:334] "Generic (PLEG): container finished" podID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerID="53d24afaeda034553864e84272778e3a253ebde84f2d11e44c01eae7bcd81031" exitCode=0 Feb 01 07:32:16 crc kubenswrapper[4546]: I0201 07:32:16.690508 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x29ns" event={"ID":"164d7a9c-7559-4fdb-a626-5ed784a738ca","Type":"ContainerDied","Data":"53d24afaeda034553864e84272778e3a253ebde84f2d11e44c01eae7bcd81031"} Feb 01 07:32:16 crc kubenswrapper[4546]: I0201 07:32:16.700081 4546 generic.go:334] "Generic (PLEG): container finished" podID="e29aeca6-3ee0-4533-9501-61597acc573a" containerID="f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb" exitCode=0 Feb 01 07:32:16 crc kubenswrapper[4546]: I0201 07:32:16.700102 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44vn7" event={"ID":"e29aeca6-3ee0-4533-9501-61597acc573a","Type":"ContainerDied","Data":"f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb"} Feb 01 07:32:17 crc kubenswrapper[4546]: I0201 07:32:17.710474 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x29ns" 
event={"ID":"164d7a9c-7559-4fdb-a626-5ed784a738ca","Type":"ContainerStarted","Data":"6022a14cc9a395cbfdaccbea3b93d0e43fbd2c04fae4e81c8e477d0695e57b9c"} Feb 01 07:32:17 crc kubenswrapper[4546]: I0201 07:32:17.716523 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44vn7" event={"ID":"e29aeca6-3ee0-4533-9501-61597acc573a","Type":"ContainerStarted","Data":"a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0"} Feb 01 07:32:17 crc kubenswrapper[4546]: I0201 07:32:17.738442 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x29ns" podStartSLOduration=4.173236714 podStartE2EDuration="7.737087029s" podCreationTimestamp="2026-02-01 07:32:10 +0000 UTC" firstStartedPulling="2026-02-01 07:32:13.634034254 +0000 UTC m=+2964.284970271" lastFinishedPulling="2026-02-01 07:32:17.19788457 +0000 UTC m=+2967.848820586" observedRunningTime="2026-02-01 07:32:17.732608805 +0000 UTC m=+2968.383544821" watchObservedRunningTime="2026-02-01 07:32:17.737087029 +0000 UTC m=+2968.388023045" Feb 01 07:32:17 crc kubenswrapper[4546]: I0201 07:32:17.752336 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-44vn7" podStartSLOduration=2.201006709 podStartE2EDuration="4.752319841s" podCreationTimestamp="2026-02-01 07:32:13 +0000 UTC" firstStartedPulling="2026-02-01 07:32:14.647407584 +0000 UTC m=+2965.298343600" lastFinishedPulling="2026-02-01 07:32:17.198720725 +0000 UTC m=+2967.849656732" observedRunningTime="2026-02-01 07:32:17.749549958 +0000 UTC m=+2968.400485974" watchObservedRunningTime="2026-02-01 07:32:17.752319841 +0000 UTC m=+2968.403255857" Feb 01 07:32:21 crc kubenswrapper[4546]: I0201 07:32:21.401323 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:21 crc kubenswrapper[4546]: I0201 07:32:21.402097 
4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:22 crc kubenswrapper[4546]: I0201 07:32:22.440382 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-x29ns" podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerName="registry-server" probeResult="failure" output=< Feb 01 07:32:22 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:32:22 crc kubenswrapper[4546]: > Feb 01 07:32:23 crc kubenswrapper[4546]: I0201 07:32:23.697597 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:23 crc kubenswrapper[4546]: I0201 07:32:23.698072 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:23 crc kubenswrapper[4546]: I0201 07:32:23.742305 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:23 crc kubenswrapper[4546]: I0201 07:32:23.835497 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:23 crc kubenswrapper[4546]: I0201 07:32:23.979288 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44vn7"] Feb 01 07:32:25 crc kubenswrapper[4546]: I0201 07:32:25.808924 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-44vn7" podUID="e29aeca6-3ee0-4533-9501-61597acc573a" containerName="registry-server" containerID="cri-o://a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0" gracePeriod=2 Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.621216 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.655678 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:32:26 crc kubenswrapper[4546]: E0201 07:32:26.656116 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.696471 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-utilities\") pod \"e29aeca6-3ee0-4533-9501-61597acc573a\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.696565 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-catalog-content\") pod \"e29aeca6-3ee0-4533-9501-61597acc573a\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.696725 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6jfp\" (UniqueName: \"kubernetes.io/projected/e29aeca6-3ee0-4533-9501-61597acc573a-kube-api-access-f6jfp\") pod \"e29aeca6-3ee0-4533-9501-61597acc573a\" (UID: \"e29aeca6-3ee0-4533-9501-61597acc573a\") " Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.703429 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-utilities" (OuterVolumeSpecName: "utilities") pod "e29aeca6-3ee0-4533-9501-61597acc573a" (UID: "e29aeca6-3ee0-4533-9501-61597acc573a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.734786 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29aeca6-3ee0-4533-9501-61597acc573a-kube-api-access-f6jfp" (OuterVolumeSpecName: "kube-api-access-f6jfp") pod "e29aeca6-3ee0-4533-9501-61597acc573a" (UID: "e29aeca6-3ee0-4533-9501-61597acc573a"). InnerVolumeSpecName "kube-api-access-f6jfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.740778 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e29aeca6-3ee0-4533-9501-61597acc573a" (UID: "e29aeca6-3ee0-4533-9501-61597acc573a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.799219 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.799256 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e29aeca6-3ee0-4533-9501-61597acc573a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.799269 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6jfp\" (UniqueName: \"kubernetes.io/projected/e29aeca6-3ee0-4533-9501-61597acc573a-kube-api-access-f6jfp\") on node \"crc\" DevicePath \"\"" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.818123 4546 generic.go:334] "Generic (PLEG): container finished" podID="e29aeca6-3ee0-4533-9501-61597acc573a" containerID="a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0" exitCode=0 Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.818176 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44vn7" event={"ID":"e29aeca6-3ee0-4533-9501-61597acc573a","Type":"ContainerDied","Data":"a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0"} Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.818199 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44vn7" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.818220 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44vn7" event={"ID":"e29aeca6-3ee0-4533-9501-61597acc573a","Type":"ContainerDied","Data":"7b4907c13dcdc6d03476700d9c90ace3c21434262542f6df86d7d32b6c412bf2"} Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.818242 4546 scope.go:117] "RemoveContainer" containerID="a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.857527 4546 scope.go:117] "RemoveContainer" containerID="f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.860345 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44vn7"] Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.882662 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-44vn7"] Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.883098 4546 scope.go:117] "RemoveContainer" containerID="c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.925072 4546 scope.go:117] "RemoveContainer" containerID="a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0" Feb 01 07:32:26 crc kubenswrapper[4546]: E0201 07:32:26.926378 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0\": container with ID starting with a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0 not found: ID does not exist" containerID="a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.926996 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0"} err="failed to get container status \"a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0\": rpc error: code = NotFound desc = could not find container \"a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0\": container with ID starting with a8343d1b24ea47f29ca715df85ea17b54ff1fd43711a039c3127aa9ee8fdd7d0 not found: ID does not exist" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.927039 4546 scope.go:117] "RemoveContainer" containerID="f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb" Feb 01 07:32:26 crc kubenswrapper[4546]: E0201 07:32:26.927388 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb\": container with ID starting with f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb not found: ID does not exist" containerID="f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.927432 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb"} err="failed to get container status \"f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb\": rpc error: code = NotFound desc = could not find container \"f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb\": container with ID starting with f02f0f8bb89c21010c7f136f5641752e90904188889117206cff8f1911a888fb not found: ID does not exist" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.927464 4546 scope.go:117] "RemoveContainer" containerID="c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995" Feb 01 07:32:26 crc kubenswrapper[4546]: E0201 
07:32:26.927830 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995\": container with ID starting with c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995 not found: ID does not exist" containerID="c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995" Feb 01 07:32:26 crc kubenswrapper[4546]: I0201 07:32:26.927891 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995"} err="failed to get container status \"c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995\": rpc error: code = NotFound desc = could not find container \"c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995\": container with ID starting with c01aee49f0e292f74a5cabe6a1d52ddf0657c767fb8af5f05dca738ef6da7995 not found: ID does not exist" Feb 01 07:32:27 crc kubenswrapper[4546]: I0201 07:32:27.668758 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29aeca6-3ee0-4533-9501-61597acc573a" path="/var/lib/kubelet/pods/e29aeca6-3ee0-4533-9501-61597acc573a/volumes" Feb 01 07:32:31 crc kubenswrapper[4546]: I0201 07:32:31.453243 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:31 crc kubenswrapper[4546]: I0201 07:32:31.522453 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:33 crc kubenswrapper[4546]: I0201 07:32:33.577305 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x29ns"] Feb 01 07:32:33 crc kubenswrapper[4546]: I0201 07:32:33.578082 4546 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-x29ns" podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerName="registry-server" containerID="cri-o://6022a14cc9a395cbfdaccbea3b93d0e43fbd2c04fae4e81c8e477d0695e57b9c" gracePeriod=2 Feb 01 07:32:33 crc kubenswrapper[4546]: I0201 07:32:33.916780 4546 generic.go:334] "Generic (PLEG): container finished" podID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerID="6022a14cc9a395cbfdaccbea3b93d0e43fbd2c04fae4e81c8e477d0695e57b9c" exitCode=0 Feb 01 07:32:33 crc kubenswrapper[4546]: I0201 07:32:33.916851 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x29ns" event={"ID":"164d7a9c-7559-4fdb-a626-5ed784a738ca","Type":"ContainerDied","Data":"6022a14cc9a395cbfdaccbea3b93d0e43fbd2c04fae4e81c8e477d0695e57b9c"} Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.208034 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.318088 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8vpq\" (UniqueName: \"kubernetes.io/projected/164d7a9c-7559-4fdb-a626-5ed784a738ca-kube-api-access-m8vpq\") pod \"164d7a9c-7559-4fdb-a626-5ed784a738ca\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.318253 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-utilities\") pod \"164d7a9c-7559-4fdb-a626-5ed784a738ca\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.318294 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-catalog-content\") pod 
\"164d7a9c-7559-4fdb-a626-5ed784a738ca\" (UID: \"164d7a9c-7559-4fdb-a626-5ed784a738ca\") " Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.319974 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-utilities" (OuterVolumeSpecName: "utilities") pod "164d7a9c-7559-4fdb-a626-5ed784a738ca" (UID: "164d7a9c-7559-4fdb-a626-5ed784a738ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.343395 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164d7a9c-7559-4fdb-a626-5ed784a738ca-kube-api-access-m8vpq" (OuterVolumeSpecName: "kube-api-access-m8vpq") pod "164d7a9c-7559-4fdb-a626-5ed784a738ca" (UID: "164d7a9c-7559-4fdb-a626-5ed784a738ca"). InnerVolumeSpecName "kube-api-access-m8vpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.402679 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "164d7a9c-7559-4fdb-a626-5ed784a738ca" (UID: "164d7a9c-7559-4fdb-a626-5ed784a738ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.422564 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8vpq\" (UniqueName: \"kubernetes.io/projected/164d7a9c-7559-4fdb-a626-5ed784a738ca-kube-api-access-m8vpq\") on node \"crc\" DevicePath \"\"" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.422594 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.422608 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/164d7a9c-7559-4fdb-a626-5ed784a738ca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.929087 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x29ns" event={"ID":"164d7a9c-7559-4fdb-a626-5ed784a738ca","Type":"ContainerDied","Data":"ced74c15e149e15f3c8534efa587f335fc0bd41335213232c6ee6c94af638602"} Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.929164 4546 scope.go:117] "RemoveContainer" containerID="6022a14cc9a395cbfdaccbea3b93d0e43fbd2c04fae4e81c8e477d0695e57b9c" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.929347 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x29ns" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.963416 4546 scope.go:117] "RemoveContainer" containerID="53d24afaeda034553864e84272778e3a253ebde84f2d11e44c01eae7bcd81031" Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.973640 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x29ns"] Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.980927 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x29ns"] Feb 01 07:32:34 crc kubenswrapper[4546]: I0201 07:32:34.990673 4546 scope.go:117] "RemoveContainer" containerID="102be77cf24ba0c4aea728bc278d01371251fb365afd787f728b6cad6db2dc5b" Feb 01 07:32:35 crc kubenswrapper[4546]: I0201 07:32:35.664150 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" path="/var/lib/kubelet/pods/164d7a9c-7559-4fdb-a626-5ed784a738ca/volumes" Feb 01 07:32:37 crc kubenswrapper[4546]: I0201 07:32:37.655785 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:32:37 crc kubenswrapper[4546]: E0201 07:32:37.656619 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:32:50 crc kubenswrapper[4546]: I0201 07:32:50.656382 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:32:50 crc kubenswrapper[4546]: E0201 07:32:50.657159 4546 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:33:02 crc kubenswrapper[4546]: I0201 07:33:02.654689 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:33:02 crc kubenswrapper[4546]: E0201 07:33:02.656919 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:33:02 crc kubenswrapper[4546]: E0201 07:33:02.819688 4546 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.196:55330->192.168.26.196:40843: write tcp 192.168.26.196:55330->192.168.26.196:40843: write: connection reset by peer Feb 01 07:33:16 crc kubenswrapper[4546]: I0201 07:33:16.654627 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:33:16 crc kubenswrapper[4546]: E0201 07:33:16.656693 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:33:30 crc kubenswrapper[4546]: I0201 07:33:30.656365 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:33:30 crc kubenswrapper[4546]: E0201 07:33:30.657739 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:33:42 crc kubenswrapper[4546]: I0201 07:33:42.656832 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:33:42 crc kubenswrapper[4546]: E0201 07:33:42.659209 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:33:55 crc kubenswrapper[4546]: I0201 07:33:55.655505 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:33:55 crc kubenswrapper[4546]: E0201 07:33:55.656962 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:34:08 crc kubenswrapper[4546]: I0201 07:34:08.655322 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:34:08 crc kubenswrapper[4546]: E0201 07:34:08.656391 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:34:19 crc kubenswrapper[4546]: I0201 07:34:19.661747 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:34:19 crc kubenswrapper[4546]: E0201 07:34:19.663006 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:34:31 crc kubenswrapper[4546]: I0201 07:34:31.655398 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:34:31 crc kubenswrapper[4546]: E0201 07:34:31.656484 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:34:43 crc kubenswrapper[4546]: I0201 07:34:43.656449 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:34:43 crc kubenswrapper[4546]: E0201 07:34:43.657386 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:34:54 crc kubenswrapper[4546]: I0201 07:34:54.655841 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:34:54 crc kubenswrapper[4546]: E0201 07:34:54.656785 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:35:09 crc kubenswrapper[4546]: I0201 07:35:09.660119 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:35:09 crc kubenswrapper[4546]: E0201 07:35:09.661121 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:35:24 crc kubenswrapper[4546]: I0201 07:35:24.655036 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:35:24 crc kubenswrapper[4546]: E0201 07:35:24.655936 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:35:37 crc kubenswrapper[4546]: I0201 07:35:37.656567 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:35:37 crc kubenswrapper[4546]: E0201 07:35:37.657542 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:35:50 crc kubenswrapper[4546]: I0201 07:35:50.657231 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:35:50 crc kubenswrapper[4546]: E0201 07:35:50.658324 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:36:04 crc kubenswrapper[4546]: I0201 07:36:04.655560 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:36:04 crc kubenswrapper[4546]: I0201 07:36:04.925367 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"fd80571e3a00d96b22a0e72b714f8fb88a6d1c5c3f68c4cca8e5706b7ef8baf3"} Feb 01 07:37:20 crc kubenswrapper[4546]: E0201 07:37:20.842313 4546 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.196:43066->192.168.26.196:40843: write tcp 192.168.26.196:43066->192.168.26.196:40843: write: broken pipe Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.271209 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wqqnr"] Feb 01 07:38:04 crc kubenswrapper[4546]: E0201 07:38:04.283390 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29aeca6-3ee0-4533-9501-61597acc573a" containerName="registry-server" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.283423 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29aeca6-3ee0-4533-9501-61597acc573a" containerName="registry-server" Feb 01 07:38:04 crc kubenswrapper[4546]: E0201 07:38:04.283486 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerName="registry-server" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.283494 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" 
containerName="registry-server" Feb 01 07:38:04 crc kubenswrapper[4546]: E0201 07:38:04.283553 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerName="extract-content" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.283561 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerName="extract-content" Feb 01 07:38:04 crc kubenswrapper[4546]: E0201 07:38:04.283580 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29aeca6-3ee0-4533-9501-61597acc573a" containerName="extract-content" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.283588 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29aeca6-3ee0-4533-9501-61597acc573a" containerName="extract-content" Feb 01 07:38:04 crc kubenswrapper[4546]: E0201 07:38:04.283618 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerName="extract-utilities" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.283626 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerName="extract-utilities" Feb 01 07:38:04 crc kubenswrapper[4546]: E0201 07:38:04.283647 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29aeca6-3ee0-4533-9501-61597acc573a" containerName="extract-utilities" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.283654 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29aeca6-3ee0-4533-9501-61597acc573a" containerName="extract-utilities" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.285024 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29aeca6-3ee0-4533-9501-61597acc573a" containerName="registry-server" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.285087 4546 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="164d7a9c-7559-4fdb-a626-5ed784a738ca" containerName="registry-server" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.293848 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.328421 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqqnr"] Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.389162 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbdg\" (UniqueName: \"kubernetes.io/projected/3f324678-0353-4d76-ad64-a33f48764155-kube-api-access-dwbdg\") pod \"certified-operators-wqqnr\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.389443 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-utilities\") pod \"certified-operators-wqqnr\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.389692 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-catalog-content\") pod \"certified-operators-wqqnr\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.492892 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-utilities\") pod \"certified-operators-wqqnr\" (UID: 
\"3f324678-0353-4d76-ad64-a33f48764155\") " pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.493053 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-catalog-content\") pod \"certified-operators-wqqnr\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.493401 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbdg\" (UniqueName: \"kubernetes.io/projected/3f324678-0353-4d76-ad64-a33f48764155-kube-api-access-dwbdg\") pod \"certified-operators-wqqnr\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.495497 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-utilities\") pod \"certified-operators-wqqnr\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.495803 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-catalog-content\") pod \"certified-operators-wqqnr\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.518114 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbdg\" (UniqueName: \"kubernetes.io/projected/3f324678-0353-4d76-ad64-a33f48764155-kube-api-access-dwbdg\") pod \"certified-operators-wqqnr\" (UID: 
\"3f324678-0353-4d76-ad64-a33f48764155\") " pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:04 crc kubenswrapper[4546]: I0201 07:38:04.634266 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:05 crc kubenswrapper[4546]: I0201 07:38:05.435170 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wqqnr"] Feb 01 07:38:06 crc kubenswrapper[4546]: I0201 07:38:06.023839 4546 generic.go:334] "Generic (PLEG): container finished" podID="3f324678-0353-4d76-ad64-a33f48764155" containerID="f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474" exitCode=0 Feb 01 07:38:06 crc kubenswrapper[4546]: I0201 07:38:06.023936 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqqnr" event={"ID":"3f324678-0353-4d76-ad64-a33f48764155","Type":"ContainerDied","Data":"f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474"} Feb 01 07:38:06 crc kubenswrapper[4546]: I0201 07:38:06.024293 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqqnr" event={"ID":"3f324678-0353-4d76-ad64-a33f48764155","Type":"ContainerStarted","Data":"162012b2bdd854d4a7fe9a590f4dba7cf0c5554a7fc2760d8ade7f573f59fcc1"} Feb 01 07:38:06 crc kubenswrapper[4546]: I0201 07:38:06.028630 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:38:07 crc kubenswrapper[4546]: I0201 07:38:07.035176 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqqnr" event={"ID":"3f324678-0353-4d76-ad64-a33f48764155","Type":"ContainerStarted","Data":"f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6"} Feb 01 07:38:09 crc kubenswrapper[4546]: I0201 07:38:09.077468 4546 generic.go:334] "Generic (PLEG): container finished" 
podID="3f324678-0353-4d76-ad64-a33f48764155" containerID="f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6" exitCode=0 Feb 01 07:38:09 crc kubenswrapper[4546]: I0201 07:38:09.079125 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqqnr" event={"ID":"3f324678-0353-4d76-ad64-a33f48764155","Type":"ContainerDied","Data":"f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6"} Feb 01 07:38:10 crc kubenswrapper[4546]: I0201 07:38:10.095701 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqqnr" event={"ID":"3f324678-0353-4d76-ad64-a33f48764155","Type":"ContainerStarted","Data":"70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032"} Feb 01 07:38:10 crc kubenswrapper[4546]: I0201 07:38:10.123060 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wqqnr" podStartSLOduration=2.552640498 podStartE2EDuration="6.12303594s" podCreationTimestamp="2026-02-01 07:38:04 +0000 UTC" firstStartedPulling="2026-02-01 07:38:06.026442777 +0000 UTC m=+3316.677378794" lastFinishedPulling="2026-02-01 07:38:09.59683822 +0000 UTC m=+3320.247774236" observedRunningTime="2026-02-01 07:38:10.118015032 +0000 UTC m=+3320.768951048" watchObservedRunningTime="2026-02-01 07:38:10.12303594 +0000 UTC m=+3320.773971946" Feb 01 07:38:14 crc kubenswrapper[4546]: I0201 07:38:14.634719 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:14 crc kubenswrapper[4546]: I0201 07:38:14.636447 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:14 crc kubenswrapper[4546]: I0201 07:38:14.679149 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:15 
crc kubenswrapper[4546]: I0201 07:38:15.168818 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:15 crc kubenswrapper[4546]: I0201 07:38:15.211342 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wqqnr"] Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.150097 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wqqnr" podUID="3f324678-0353-4d76-ad64-a33f48764155" containerName="registry-server" containerID="cri-o://70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032" gracePeriod=2 Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.698003 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.843776 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-catalog-content\") pod \"3f324678-0353-4d76-ad64-a33f48764155\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.844162 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwbdg\" (UniqueName: \"kubernetes.io/projected/3f324678-0353-4d76-ad64-a33f48764155-kube-api-access-dwbdg\") pod \"3f324678-0353-4d76-ad64-a33f48764155\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.844242 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-utilities\") pod \"3f324678-0353-4d76-ad64-a33f48764155\" (UID: \"3f324678-0353-4d76-ad64-a33f48764155\") " Feb 01 
07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.846527 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-utilities" (OuterVolumeSpecName: "utilities") pod "3f324678-0353-4d76-ad64-a33f48764155" (UID: "3f324678-0353-4d76-ad64-a33f48764155"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.856691 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f324678-0353-4d76-ad64-a33f48764155-kube-api-access-dwbdg" (OuterVolumeSpecName: "kube-api-access-dwbdg") pod "3f324678-0353-4d76-ad64-a33f48764155" (UID: "3f324678-0353-4d76-ad64-a33f48764155"). InnerVolumeSpecName "kube-api-access-dwbdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.886402 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f324678-0353-4d76-ad64-a33f48764155" (UID: "3f324678-0353-4d76-ad64-a33f48764155"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.948211 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwbdg\" (UniqueName: \"kubernetes.io/projected/3f324678-0353-4d76-ad64-a33f48764155-kube-api-access-dwbdg\") on node \"crc\" DevicePath \"\"" Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.948262 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:38:17 crc kubenswrapper[4546]: I0201 07:38:17.948274 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f324678-0353-4d76-ad64-a33f48764155-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.160095 4546 generic.go:334] "Generic (PLEG): container finished" podID="3f324678-0353-4d76-ad64-a33f48764155" containerID="70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032" exitCode=0 Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.160144 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqqnr" event={"ID":"3f324678-0353-4d76-ad64-a33f48764155","Type":"ContainerDied","Data":"70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032"} Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.160178 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wqqnr" event={"ID":"3f324678-0353-4d76-ad64-a33f48764155","Type":"ContainerDied","Data":"162012b2bdd854d4a7fe9a590f4dba7cf0c5554a7fc2760d8ade7f573f59fcc1"} Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.160199 4546 scope.go:117] "RemoveContainer" containerID="70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032" Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 
07:38:18.160353 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wqqnr" Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.189408 4546 scope.go:117] "RemoveContainer" containerID="f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6" Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.195536 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wqqnr"] Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.211306 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wqqnr"] Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.213736 4546 scope.go:117] "RemoveContainer" containerID="f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474" Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.249406 4546 scope.go:117] "RemoveContainer" containerID="70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032" Feb 01 07:38:18 crc kubenswrapper[4546]: E0201 07:38:18.250459 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032\": container with ID starting with 70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032 not found: ID does not exist" containerID="70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032" Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.251412 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032"} err="failed to get container status \"70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032\": rpc error: code = NotFound desc = could not find container \"70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032\": container with ID starting with 
70dacf7062077dd902fae8f34adddfa0efe18ac998c0c4a426b2ee0be248f032 not found: ID does not exist" Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.251463 4546 scope.go:117] "RemoveContainer" containerID="f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6" Feb 01 07:38:18 crc kubenswrapper[4546]: E0201 07:38:18.252177 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6\": container with ID starting with f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6 not found: ID does not exist" containerID="f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6" Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.252221 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6"} err="failed to get container status \"f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6\": rpc error: code = NotFound desc = could not find container \"f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6\": container with ID starting with f659320307cd4f8bad20e7d896bdaff1e63a6170a2a2c78a2f7ac303b936c1e6 not found: ID does not exist" Feb 01 07:38:18 crc kubenswrapper[4546]: I0201 07:38:18.252238 4546 scope.go:117] "RemoveContainer" containerID="f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474" Feb 01 07:38:18 crc kubenswrapper[4546]: E0201 07:38:18.252666 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474\": container with ID starting with f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474 not found: ID does not exist" containerID="f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474" Feb 01 07:38:18 crc 
kubenswrapper[4546]: I0201 07:38:18.252687 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474"} err="failed to get container status \"f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474\": rpc error: code = NotFound desc = could not find container \"f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474\": container with ID starting with f3a4e8b2931f1f5b43949f868d92902a1d625b76b785f5219b39f0b6e7996474 not found: ID does not exist" Feb 01 07:38:19 crc kubenswrapper[4546]: I0201 07:38:19.664324 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f324678-0353-4d76-ad64-a33f48764155" path="/var/lib/kubelet/pods/3f324678-0353-4d76-ad64-a33f48764155/volumes" Feb 01 07:38:25 crc kubenswrapper[4546]: I0201 07:38:25.420310 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:38:25 crc kubenswrapper[4546]: I0201 07:38:25.421210 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:38:55 crc kubenswrapper[4546]: I0201 07:38:55.420602 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:38:55 crc kubenswrapper[4546]: I0201 07:38:55.421334 4546 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:39:25 crc kubenswrapper[4546]: I0201 07:39:25.420915 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:39:25 crc kubenswrapper[4546]: I0201 07:39:25.421602 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:39:25 crc kubenswrapper[4546]: I0201 07:39:25.421679 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:39:25 crc kubenswrapper[4546]: I0201 07:39:25.422464 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd80571e3a00d96b22a0e72b714f8fb88a6d1c5c3f68c4cca8e5706b7ef8baf3"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:39:25 crc kubenswrapper[4546]: I0201 07:39:25.422541 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" 
containerName="machine-config-daemon" containerID="cri-o://fd80571e3a00d96b22a0e72b714f8fb88a6d1c5c3f68c4cca8e5706b7ef8baf3" gracePeriod=600 Feb 01 07:39:25 crc kubenswrapper[4546]: I0201 07:39:25.730494 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="fd80571e3a00d96b22a0e72b714f8fb88a6d1c5c3f68c4cca8e5706b7ef8baf3" exitCode=0 Feb 01 07:39:25 crc kubenswrapper[4546]: I0201 07:39:25.731000 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"fd80571e3a00d96b22a0e72b714f8fb88a6d1c5c3f68c4cca8e5706b7ef8baf3"} Feb 01 07:39:25 crc kubenswrapper[4546]: I0201 07:39:25.731081 4546 scope.go:117] "RemoveContainer" containerID="38c6da4cb4a83480a6806eeda334f7e0d6565ab383f6d63f15f584a6e54dc008" Feb 01 07:39:26 crc kubenswrapper[4546]: I0201 07:39:26.742148 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a"} Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.865609 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4j5mz"] Feb 01 07:40:18 crc kubenswrapper[4546]: E0201 07:40:18.866916 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f324678-0353-4d76-ad64-a33f48764155" containerName="extract-content" Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.866935 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f324678-0353-4d76-ad64-a33f48764155" containerName="extract-content" Feb 01 07:40:18 crc kubenswrapper[4546]: E0201 07:40:18.866969 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f324678-0353-4d76-ad64-a33f48764155" 
containerName="extract-utilities" Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.866977 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f324678-0353-4d76-ad64-a33f48764155" containerName="extract-utilities" Feb 01 07:40:18 crc kubenswrapper[4546]: E0201 07:40:18.867012 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f324678-0353-4d76-ad64-a33f48764155" containerName="registry-server" Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.867019 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f324678-0353-4d76-ad64-a33f48764155" containerName="registry-server" Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.867321 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f324678-0353-4d76-ad64-a33f48764155" containerName="registry-server" Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.873982 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.887990 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4j5mz"] Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.919677 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-catalog-content\") pod \"redhat-operators-4j5mz\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.919748 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f6p8\" (UniqueName: \"kubernetes.io/projected/1c04eb75-d907-45ee-8677-3f1f74658917-kube-api-access-7f6p8\") pod \"redhat-operators-4j5mz\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " 
pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:18 crc kubenswrapper[4546]: I0201 07:40:18.920088 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-utilities\") pod \"redhat-operators-4j5mz\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:19 crc kubenswrapper[4546]: I0201 07:40:19.022307 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-catalog-content\") pod \"redhat-operators-4j5mz\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:19 crc kubenswrapper[4546]: I0201 07:40:19.022368 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f6p8\" (UniqueName: \"kubernetes.io/projected/1c04eb75-d907-45ee-8677-3f1f74658917-kube-api-access-7f6p8\") pod \"redhat-operators-4j5mz\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:19 crc kubenswrapper[4546]: I0201 07:40:19.022500 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-utilities\") pod \"redhat-operators-4j5mz\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:19 crc kubenswrapper[4546]: I0201 07:40:19.022984 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-utilities\") pod \"redhat-operators-4j5mz\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 
07:40:19 crc kubenswrapper[4546]: I0201 07:40:19.023212 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-catalog-content\") pod \"redhat-operators-4j5mz\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:19 crc kubenswrapper[4546]: I0201 07:40:19.039495 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f6p8\" (UniqueName: \"kubernetes.io/projected/1c04eb75-d907-45ee-8677-3f1f74658917-kube-api-access-7f6p8\") pod \"redhat-operators-4j5mz\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:19 crc kubenswrapper[4546]: I0201 07:40:19.203770 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:19 crc kubenswrapper[4546]: I0201 07:40:19.643120 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4j5mz"] Feb 01 07:40:20 crc kubenswrapper[4546]: I0201 07:40:20.279567 4546 generic.go:334] "Generic (PLEG): container finished" podID="1c04eb75-d907-45ee-8677-3f1f74658917" containerID="8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55" exitCode=0 Feb 01 07:40:20 crc kubenswrapper[4546]: I0201 07:40:20.280306 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j5mz" event={"ID":"1c04eb75-d907-45ee-8677-3f1f74658917","Type":"ContainerDied","Data":"8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55"} Feb 01 07:40:20 crc kubenswrapper[4546]: I0201 07:40:20.280361 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j5mz" 
event={"ID":"1c04eb75-d907-45ee-8677-3f1f74658917","Type":"ContainerStarted","Data":"fc68372d633a97ed965276e567540835c5848570af91a4b9c3e6ac54fffee0ae"} Feb 01 07:40:21 crc kubenswrapper[4546]: I0201 07:40:21.298052 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j5mz" event={"ID":"1c04eb75-d907-45ee-8677-3f1f74658917","Type":"ContainerStarted","Data":"0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c"} Feb 01 07:40:24 crc kubenswrapper[4546]: I0201 07:40:24.331048 4546 generic.go:334] "Generic (PLEG): container finished" podID="1c04eb75-d907-45ee-8677-3f1f74658917" containerID="0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c" exitCode=0 Feb 01 07:40:24 crc kubenswrapper[4546]: I0201 07:40:24.331136 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j5mz" event={"ID":"1c04eb75-d907-45ee-8677-3f1f74658917","Type":"ContainerDied","Data":"0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c"} Feb 01 07:40:25 crc kubenswrapper[4546]: I0201 07:40:25.343785 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j5mz" event={"ID":"1c04eb75-d907-45ee-8677-3f1f74658917","Type":"ContainerStarted","Data":"a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8"} Feb 01 07:40:25 crc kubenswrapper[4546]: I0201 07:40:25.374614 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4j5mz" podStartSLOduration=2.8041140970000002 podStartE2EDuration="7.374589811s" podCreationTimestamp="2026-02-01 07:40:18 +0000 UTC" firstStartedPulling="2026-02-01 07:40:20.282740705 +0000 UTC m=+3450.933676721" lastFinishedPulling="2026-02-01 07:40:24.853216419 +0000 UTC m=+3455.504152435" observedRunningTime="2026-02-01 07:40:25.365249017 +0000 UTC m=+3456.016185033" watchObservedRunningTime="2026-02-01 07:40:25.374589811 +0000 UTC 
m=+3456.025525827" Feb 01 07:40:29 crc kubenswrapper[4546]: I0201 07:40:29.204683 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:29 crc kubenswrapper[4546]: I0201 07:40:29.205512 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:30 crc kubenswrapper[4546]: I0201 07:40:30.252742 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4j5mz" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="registry-server" probeResult="failure" output=< Feb 01 07:40:30 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:40:30 crc kubenswrapper[4546]: > Feb 01 07:40:40 crc kubenswrapper[4546]: I0201 07:40:40.261091 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4j5mz" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="registry-server" probeResult="failure" output=< Feb 01 07:40:40 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:40:40 crc kubenswrapper[4546]: > Feb 01 07:40:49 crc kubenswrapper[4546]: I0201 07:40:49.245707 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:49 crc kubenswrapper[4546]: I0201 07:40:49.289272 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:50 crc kubenswrapper[4546]: I0201 07:40:50.065242 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4j5mz"] Feb 01 07:40:50 crc kubenswrapper[4546]: I0201 07:40:50.585988 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4j5mz" 
podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="registry-server" containerID="cri-o://a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8" gracePeriod=2 Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.256958 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.336554 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-catalog-content\") pod \"1c04eb75-d907-45ee-8677-3f1f74658917\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.336918 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f6p8\" (UniqueName: \"kubernetes.io/projected/1c04eb75-d907-45ee-8677-3f1f74658917-kube-api-access-7f6p8\") pod \"1c04eb75-d907-45ee-8677-3f1f74658917\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.336989 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-utilities\") pod \"1c04eb75-d907-45ee-8677-3f1f74658917\" (UID: \"1c04eb75-d907-45ee-8677-3f1f74658917\") " Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.337839 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-utilities" (OuterVolumeSpecName: "utilities") pod "1c04eb75-d907-45ee-8677-3f1f74658917" (UID: "1c04eb75-d907-45ee-8677-3f1f74658917"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.354410 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c04eb75-d907-45ee-8677-3f1f74658917-kube-api-access-7f6p8" (OuterVolumeSpecName: "kube-api-access-7f6p8") pod "1c04eb75-d907-45ee-8677-3f1f74658917" (UID: "1c04eb75-d907-45ee-8677-3f1f74658917"). InnerVolumeSpecName "kube-api-access-7f6p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.439505 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.439532 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f6p8\" (UniqueName: \"kubernetes.io/projected/1c04eb75-d907-45ee-8677-3f1f74658917-kube-api-access-7f6p8\") on node \"crc\" DevicePath \"\"" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.444795 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c04eb75-d907-45ee-8677-3f1f74658917" (UID: "1c04eb75-d907-45ee-8677-3f1f74658917"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.563730 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c04eb75-d907-45ee-8677-3f1f74658917-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.598916 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j5mz" event={"ID":"1c04eb75-d907-45ee-8677-3f1f74658917","Type":"ContainerDied","Data":"a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8"} Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.599785 4546 scope.go:117] "RemoveContainer" containerID="a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.598811 4546 generic.go:334] "Generic (PLEG): container finished" podID="1c04eb75-d907-45ee-8677-3f1f74658917" containerID="a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8" exitCode=0 Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.598930 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4j5mz" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.600013 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j5mz" event={"ID":"1c04eb75-d907-45ee-8677-3f1f74658917","Type":"ContainerDied","Data":"fc68372d633a97ed965276e567540835c5848570af91a4b9c3e6ac54fffee0ae"} Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.627210 4546 scope.go:117] "RemoveContainer" containerID="0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.634336 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4j5mz"] Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.643986 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4j5mz"] Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.658670 4546 scope.go:117] "RemoveContainer" containerID="8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.681541 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" path="/var/lib/kubelet/pods/1c04eb75-d907-45ee-8677-3f1f74658917/volumes" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.697050 4546 scope.go:117] "RemoveContainer" containerID="a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8" Feb 01 07:40:51 crc kubenswrapper[4546]: E0201 07:40:51.699997 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8\": container with ID starting with a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8 not found: ID does not exist" containerID="a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8" Feb 01 
07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.700049 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8"} err="failed to get container status \"a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8\": rpc error: code = NotFound desc = could not find container \"a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8\": container with ID starting with a35676f426304581f458207349b3dd12f0ba10216392e1bb2030cf66600b5ae8 not found: ID does not exist" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.700081 4546 scope.go:117] "RemoveContainer" containerID="0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c" Feb 01 07:40:51 crc kubenswrapper[4546]: E0201 07:40:51.701136 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c\": container with ID starting with 0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c not found: ID does not exist" containerID="0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.701199 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c"} err="failed to get container status \"0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c\": rpc error: code = NotFound desc = could not find container \"0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c\": container with ID starting with 0e450b3b4a9f380ed21ca75c146c6a93ac9426c47358a8e03f8703c39749322c not found: ID does not exist" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.701238 4546 scope.go:117] "RemoveContainer" 
containerID="8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55" Feb 01 07:40:51 crc kubenswrapper[4546]: E0201 07:40:51.701922 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55\": container with ID starting with 8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55 not found: ID does not exist" containerID="8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55" Feb 01 07:40:51 crc kubenswrapper[4546]: I0201 07:40:51.701949 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55"} err="failed to get container status \"8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55\": rpc error: code = NotFound desc = could not find container \"8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55\": container with ID starting with 8ac735b8ec9d1843a453f0c82db8a5d2daf32a0e8fae65c41e92428f63defe55 not found: ID does not exist" Feb 01 07:41:25 crc kubenswrapper[4546]: I0201 07:41:25.420746 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:41:25 crc kubenswrapper[4546]: I0201 07:41:25.421417 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:41:55 crc kubenswrapper[4546]: I0201 07:41:55.421207 4546 patch_prober.go:28] interesting 
pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:41:55 crc kubenswrapper[4546]: I0201 07:41:55.421721 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:42:25 crc kubenswrapper[4546]: I0201 07:42:25.421119 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:42:25 crc kubenswrapper[4546]: I0201 07:42:25.421819 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:42:25 crc kubenswrapper[4546]: I0201 07:42:25.421902 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:42:25 crc kubenswrapper[4546]: I0201 07:42:25.422786 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Feb 01 07:42:25 crc kubenswrapper[4546]: I0201 07:42:25.423051 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" gracePeriod=600 Feb 01 07:42:25 crc kubenswrapper[4546]: E0201 07:42:25.549279 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:42:25 crc kubenswrapper[4546]: I0201 07:42:25.569089 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" exitCode=0 Feb 01 07:42:25 crc kubenswrapper[4546]: I0201 07:42:25.569231 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a"} Feb 01 07:42:25 crc kubenswrapper[4546]: I0201 07:42:25.569293 4546 scope.go:117] "RemoveContainer" containerID="fd80571e3a00d96b22a0e72b714f8fb88a6d1c5c3f68c4cca8e5706b7ef8baf3" Feb 01 07:42:25 crc kubenswrapper[4546]: I0201 07:42:25.577383 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:42:25 crc kubenswrapper[4546]: E0201 07:42:25.577794 4546 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:42:39 crc kubenswrapper[4546]: I0201 07:42:39.661085 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:42:39 crc kubenswrapper[4546]: E0201 07:42:39.662340 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:42:50 crc kubenswrapper[4546]: I0201 07:42:50.655965 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:42:50 crc kubenswrapper[4546]: E0201 07:42:50.656845 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.030199 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cx947"] Feb 01 07:42:57 crc kubenswrapper[4546]: E0201 07:42:57.031151 4546 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="registry-server" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.031167 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="registry-server" Feb 01 07:42:57 crc kubenswrapper[4546]: E0201 07:42:57.031181 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="extract-content" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.031186 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="extract-content" Feb 01 07:42:57 crc kubenswrapper[4546]: E0201 07:42:57.031195 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="extract-utilities" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.031201 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="extract-utilities" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.031419 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c04eb75-d907-45ee-8677-3f1f74658917" containerName="registry-server" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.032785 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.080487 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx947"] Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.095769 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-catalog-content\") pod \"redhat-marketplace-cx947\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.095894 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2thpt\" (UniqueName: \"kubernetes.io/projected/a425e0d6-b9a2-47c3-973a-68eae78650db-kube-api-access-2thpt\") pod \"redhat-marketplace-cx947\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.095979 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-utilities\") pod \"redhat-marketplace-cx947\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.198795 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-catalog-content\") pod \"redhat-marketplace-cx947\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.198903 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2thpt\" (UniqueName: \"kubernetes.io/projected/a425e0d6-b9a2-47c3-973a-68eae78650db-kube-api-access-2thpt\") pod \"redhat-marketplace-cx947\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.198962 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-utilities\") pod \"redhat-marketplace-cx947\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.199468 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-catalog-content\") pod \"redhat-marketplace-cx947\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.199478 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-utilities\") pod \"redhat-marketplace-cx947\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.218431 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2thpt\" (UniqueName: \"kubernetes.io/projected/a425e0d6-b9a2-47c3-973a-68eae78650db-kube-api-access-2thpt\") pod \"redhat-marketplace-cx947\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.377042 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.852964 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx947"] Feb 01 07:42:57 crc kubenswrapper[4546]: I0201 07:42:57.922132 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx947" event={"ID":"a425e0d6-b9a2-47c3-973a-68eae78650db","Type":"ContainerStarted","Data":"6c8c731368624179efa172965e53f2397c33852736c894b856373c3c7f26e46f"} Feb 01 07:42:58 crc kubenswrapper[4546]: I0201 07:42:58.933998 4546 generic.go:334] "Generic (PLEG): container finished" podID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerID="d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2" exitCode=0 Feb 01 07:42:58 crc kubenswrapper[4546]: I0201 07:42:58.934052 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx947" event={"ID":"a425e0d6-b9a2-47c3-973a-68eae78650db","Type":"ContainerDied","Data":"d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2"} Feb 01 07:42:59 crc kubenswrapper[4546]: I0201 07:42:59.945398 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx947" event={"ID":"a425e0d6-b9a2-47c3-973a-68eae78650db","Type":"ContainerStarted","Data":"8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c"} Feb 01 07:43:00 crc kubenswrapper[4546]: I0201 07:43:00.956801 4546 generic.go:334] "Generic (PLEG): container finished" podID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerID="8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c" exitCode=0 Feb 01 07:43:00 crc kubenswrapper[4546]: I0201 07:43:00.956906 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx947" 
event={"ID":"a425e0d6-b9a2-47c3-973a-68eae78650db","Type":"ContainerDied","Data":"8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c"} Feb 01 07:43:01 crc kubenswrapper[4546]: I0201 07:43:01.967918 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx947" event={"ID":"a425e0d6-b9a2-47c3-973a-68eae78650db","Type":"ContainerStarted","Data":"618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44"} Feb 01 07:43:01 crc kubenswrapper[4546]: I0201 07:43:01.991514 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cx947" podStartSLOduration=2.474962366 podStartE2EDuration="4.991485546s" podCreationTimestamp="2026-02-01 07:42:57 +0000 UTC" firstStartedPulling="2026-02-01 07:42:58.936209643 +0000 UTC m=+3609.587145659" lastFinishedPulling="2026-02-01 07:43:01.452732823 +0000 UTC m=+3612.103668839" observedRunningTime="2026-02-01 07:43:01.98832716 +0000 UTC m=+3612.639263167" watchObservedRunningTime="2026-02-01 07:43:01.991485546 +0000 UTC m=+3612.642421562" Feb 01 07:43:03 crc kubenswrapper[4546]: I0201 07:43:03.657313 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:43:03 crc kubenswrapper[4546]: E0201 07:43:03.657795 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:43:07 crc kubenswrapper[4546]: I0201 07:43:07.377441 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:43:07 crc 
kubenswrapper[4546]: I0201 07:43:07.377895 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:43:07 crc kubenswrapper[4546]: I0201 07:43:07.417199 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:43:08 crc kubenswrapper[4546]: I0201 07:43:08.071651 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:43:08 crc kubenswrapper[4546]: I0201 07:43:08.130601 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx947"] Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.056543 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cx947" podUID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerName="registry-server" containerID="cri-o://618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44" gracePeriod=2 Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.505396 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.609457 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-catalog-content\") pod \"a425e0d6-b9a2-47c3-973a-68eae78650db\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.609899 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-utilities\") pod \"a425e0d6-b9a2-47c3-973a-68eae78650db\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.610009 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2thpt\" (UniqueName: \"kubernetes.io/projected/a425e0d6-b9a2-47c3-973a-68eae78650db-kube-api-access-2thpt\") pod \"a425e0d6-b9a2-47c3-973a-68eae78650db\" (UID: \"a425e0d6-b9a2-47c3-973a-68eae78650db\") " Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.612195 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-utilities" (OuterVolumeSpecName: "utilities") pod "a425e0d6-b9a2-47c3-973a-68eae78650db" (UID: "a425e0d6-b9a2-47c3-973a-68eae78650db"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.613693 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.621550 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a425e0d6-b9a2-47c3-973a-68eae78650db-kube-api-access-2thpt" (OuterVolumeSpecName: "kube-api-access-2thpt") pod "a425e0d6-b9a2-47c3-973a-68eae78650db" (UID: "a425e0d6-b9a2-47c3-973a-68eae78650db"). InnerVolumeSpecName "kube-api-access-2thpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.630107 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a425e0d6-b9a2-47c3-973a-68eae78650db" (UID: "a425e0d6-b9a2-47c3-973a-68eae78650db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.715136 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2thpt\" (UniqueName: \"kubernetes.io/projected/a425e0d6-b9a2-47c3-973a-68eae78650db-kube-api-access-2thpt\") on node \"crc\" DevicePath \"\"" Feb 01 07:43:10 crc kubenswrapper[4546]: I0201 07:43:10.715533 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a425e0d6-b9a2-47c3-973a-68eae78650db-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.070169 4546 generic.go:334] "Generic (PLEG): container finished" podID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerID="618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44" exitCode=0 Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.070218 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx947" event={"ID":"a425e0d6-b9a2-47c3-973a-68eae78650db","Type":"ContainerDied","Data":"618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44"} Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.070270 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cx947" event={"ID":"a425e0d6-b9a2-47c3-973a-68eae78650db","Type":"ContainerDied","Data":"6c8c731368624179efa172965e53f2397c33852736c894b856373c3c7f26e46f"} Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.070294 4546 scope.go:117] "RemoveContainer" containerID="618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.070438 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cx947" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.114792 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx947"] Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.126417 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cx947"] Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.127409 4546 scope.go:117] "RemoveContainer" containerID="8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.155454 4546 scope.go:117] "RemoveContainer" containerID="d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.196816 4546 scope.go:117] "RemoveContainer" containerID="618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44" Feb 01 07:43:11 crc kubenswrapper[4546]: E0201 07:43:11.197249 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44\": container with ID starting with 618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44 not found: ID does not exist" containerID="618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.197296 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44"} err="failed to get container status \"618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44\": rpc error: code = NotFound desc = could not find container \"618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44\": container with ID starting with 618a16235ac993d547bf44fad1383f860c8f4e5770a7a030b975271b5e046d44 not found: 
ID does not exist" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.197336 4546 scope.go:117] "RemoveContainer" containerID="8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c" Feb 01 07:43:11 crc kubenswrapper[4546]: E0201 07:43:11.197672 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c\": container with ID starting with 8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c not found: ID does not exist" containerID="8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.197711 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c"} err="failed to get container status \"8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c\": rpc error: code = NotFound desc = could not find container \"8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c\": container with ID starting with 8fe2b01df473671a1dfbf758948dfe30e0f70161e3d39acdf6ab8d74a496069c not found: ID does not exist" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.197737 4546 scope.go:117] "RemoveContainer" containerID="d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2" Feb 01 07:43:11 crc kubenswrapper[4546]: E0201 07:43:11.198093 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2\": container with ID starting with d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2 not found: ID does not exist" containerID="d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.198115 4546 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2"} err="failed to get container status \"d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2\": rpc error: code = NotFound desc = could not find container \"d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2\": container with ID starting with d457add7ab246637e7954f0b82d1b5d8417589e4df4b206ba9b4374c9e58cee2 not found: ID does not exist" Feb 01 07:43:11 crc kubenswrapper[4546]: I0201 07:43:11.666163 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a425e0d6-b9a2-47c3-973a-68eae78650db" path="/var/lib/kubelet/pods/a425e0d6-b9a2-47c3-973a-68eae78650db/volumes" Feb 01 07:43:16 crc kubenswrapper[4546]: I0201 07:43:16.655381 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:43:16 crc kubenswrapper[4546]: E0201 07:43:16.656371 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:43:27 crc kubenswrapper[4546]: I0201 07:43:27.655542 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:43:27 crc kubenswrapper[4546]: E0201 07:43:27.656487 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:43:42 crc kubenswrapper[4546]: I0201 07:43:42.655437 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:43:42 crc kubenswrapper[4546]: E0201 07:43:42.656383 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:43:54 crc kubenswrapper[4546]: I0201 07:43:54.656754 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:43:54 crc kubenswrapper[4546]: E0201 07:43:54.657993 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:44:09 crc kubenswrapper[4546]: I0201 07:44:09.662682 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:44:09 crc kubenswrapper[4546]: E0201 07:44:09.663794 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:44:23 crc kubenswrapper[4546]: I0201 07:44:23.654968 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:44:23 crc kubenswrapper[4546]: E0201 07:44:23.656202 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:44:34 crc kubenswrapper[4546]: I0201 07:44:34.655103 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:44:34 crc kubenswrapper[4546]: E0201 07:44:34.656030 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:44:45 crc kubenswrapper[4546]: I0201 07:44:45.655198 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:44:45 crc kubenswrapper[4546]: E0201 07:44:45.656410 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:44:56 crc kubenswrapper[4546]: I0201 07:44:56.654772 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:44:56 crc kubenswrapper[4546]: E0201 07:44:56.657104 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.176926 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8"] Feb 01 07:45:00 crc kubenswrapper[4546]: E0201 07:45:00.180203 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerName="extract-content" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.180222 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerName="extract-content" Feb 01 07:45:00 crc kubenswrapper[4546]: E0201 07:45:00.180270 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerName="registry-server" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.180277 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerName="registry-server" Feb 01 07:45:00 crc kubenswrapper[4546]: E0201 07:45:00.180291 4546 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerName="extract-utilities" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.180299 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerName="extract-utilities" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.181260 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a425e0d6-b9a2-47c3-973a-68eae78650db" containerName="registry-server" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.182578 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.193981 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.194168 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.196601 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8"] Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.285099 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-secret-volume\") pod \"collect-profiles-29498865-2pjh8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.285170 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rss\" (UniqueName: 
\"kubernetes.io/projected/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-kube-api-access-94rss\") pod \"collect-profiles-29498865-2pjh8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.285236 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-config-volume\") pod \"collect-profiles-29498865-2pjh8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.388042 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-config-volume\") pod \"collect-profiles-29498865-2pjh8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.388314 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-secret-volume\") pod \"collect-profiles-29498865-2pjh8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.388412 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rss\" (UniqueName: \"kubernetes.io/projected/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-kube-api-access-94rss\") pod \"collect-profiles-29498865-2pjh8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc 
kubenswrapper[4546]: I0201 07:45:00.389086 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-config-volume\") pod \"collect-profiles-29498865-2pjh8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.395578 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-secret-volume\") pod \"collect-profiles-29498865-2pjh8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.405370 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rss\" (UniqueName: \"kubernetes.io/projected/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-kube-api-access-94rss\") pod \"collect-profiles-29498865-2pjh8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.508373 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:00 crc kubenswrapper[4546]: I0201 07:45:00.974314 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8"] Feb 01 07:45:01 crc kubenswrapper[4546]: I0201 07:45:01.137607 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" event={"ID":"abe2eed6-dd1b-4865-8ee7-1c675edda8c8","Type":"ContainerStarted","Data":"483b4285388fa5265cef1048c28a1c54297a500a6341411692820a378c923b92"} Feb 01 07:45:01 crc kubenswrapper[4546]: I0201 07:45:01.137840 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" event={"ID":"abe2eed6-dd1b-4865-8ee7-1c675edda8c8","Type":"ContainerStarted","Data":"2304b35b66aa3ec5881e8e22b88372d562a03318df36be1a9095d4e30d31ce48"} Feb 01 07:45:01 crc kubenswrapper[4546]: I0201 07:45:01.155238 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" podStartSLOduration=1.155217836 podStartE2EDuration="1.155217836s" podCreationTimestamp="2026-02-01 07:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:45:01.150547579 +0000 UTC m=+3731.801483595" watchObservedRunningTime="2026-02-01 07:45:01.155217836 +0000 UTC m=+3731.806153851" Feb 01 07:45:02 crc kubenswrapper[4546]: I0201 07:45:02.150075 4546 generic.go:334] "Generic (PLEG): container finished" podID="abe2eed6-dd1b-4865-8ee7-1c675edda8c8" containerID="483b4285388fa5265cef1048c28a1c54297a500a6341411692820a378c923b92" exitCode=0 Feb 01 07:45:02 crc kubenswrapper[4546]: I0201 07:45:02.150174 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" event={"ID":"abe2eed6-dd1b-4865-8ee7-1c675edda8c8","Type":"ContainerDied","Data":"483b4285388fa5265cef1048c28a1c54297a500a6341411692820a378c923b92"} Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.467752 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.557820 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-secret-volume\") pod \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.557958 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-config-volume\") pod \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.558118 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rss\" (UniqueName: \"kubernetes.io/projected/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-kube-api-access-94rss\") pod \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\" (UID: \"abe2eed6-dd1b-4865-8ee7-1c675edda8c8\") " Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.558667 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "abe2eed6-dd1b-4865-8ee7-1c675edda8c8" (UID: "abe2eed6-dd1b-4865-8ee7-1c675edda8c8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.559178 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.566313 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "abe2eed6-dd1b-4865-8ee7-1c675edda8c8" (UID: "abe2eed6-dd1b-4865-8ee7-1c675edda8c8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.566482 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-kube-api-access-94rss" (OuterVolumeSpecName: "kube-api-access-94rss") pod "abe2eed6-dd1b-4865-8ee7-1c675edda8c8" (UID: "abe2eed6-dd1b-4865-8ee7-1c675edda8c8"). InnerVolumeSpecName "kube-api-access-94rss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.662393 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94rss\" (UniqueName: \"kubernetes.io/projected/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-kube-api-access-94rss\") on node \"crc\" DevicePath \"\"" Feb 01 07:45:03 crc kubenswrapper[4546]: I0201 07:45:03.662444 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/abe2eed6-dd1b-4865-8ee7-1c675edda8c8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:45:04 crc kubenswrapper[4546]: I0201 07:45:04.170849 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" event={"ID":"abe2eed6-dd1b-4865-8ee7-1c675edda8c8","Type":"ContainerDied","Data":"2304b35b66aa3ec5881e8e22b88372d562a03318df36be1a9095d4e30d31ce48"} Feb 01 07:45:04 crc kubenswrapper[4546]: I0201 07:45:04.171252 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2304b35b66aa3ec5881e8e22b88372d562a03318df36be1a9095d4e30d31ce48" Feb 01 07:45:04 crc kubenswrapper[4546]: I0201 07:45:04.170979 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8" Feb 01 07:45:04 crc kubenswrapper[4546]: I0201 07:45:04.561456 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq"] Feb 01 07:45:04 crc kubenswrapper[4546]: I0201 07:45:04.569772 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-982tq"] Feb 01 07:45:05 crc kubenswrapper[4546]: I0201 07:45:05.666601 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b38979b-e35d-4fa3-a515-1e91fb6bf310" path="/var/lib/kubelet/pods/6b38979b-e35d-4fa3-a515-1e91fb6bf310/volumes" Feb 01 07:45:10 crc kubenswrapper[4546]: I0201 07:45:10.655486 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:45:10 crc kubenswrapper[4546]: E0201 07:45:10.656592 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:45:21 crc kubenswrapper[4546]: I0201 07:45:21.655091 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:45:21 crc kubenswrapper[4546]: E0201 07:45:21.656113 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:45:36 crc kubenswrapper[4546]: I0201 07:45:36.655262 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:45:36 crc kubenswrapper[4546]: E0201 07:45:36.656363 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:45:48 crc kubenswrapper[4546]: I0201 07:45:48.655803 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:45:48 crc kubenswrapper[4546]: E0201 07:45:48.656912 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:45:58 crc kubenswrapper[4546]: I0201 07:45:58.183084 4546 scope.go:117] "RemoveContainer" containerID="3923f276a66accbb6ef12ed7810189d9749e2378bbbc2af011b77abe62099391" Feb 01 07:46:02 crc kubenswrapper[4546]: I0201 07:46:02.655363 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:46:02 crc kubenswrapper[4546]: E0201 07:46:02.656541 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:46:14 crc kubenswrapper[4546]: I0201 07:46:14.654691 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:46:14 crc kubenswrapper[4546]: E0201 07:46:14.656772 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:46:25 crc kubenswrapper[4546]: I0201 07:46:25.655932 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:46:25 crc kubenswrapper[4546]: E0201 07:46:25.656813 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:46:36 crc kubenswrapper[4546]: I0201 07:46:36.654494 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:46:36 crc kubenswrapper[4546]: E0201 07:46:36.655408 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:46:48 crc kubenswrapper[4546]: I0201 07:46:48.655718 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:46:48 crc kubenswrapper[4546]: E0201 07:46:48.657429 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:46:59 crc kubenswrapper[4546]: I0201 07:46:59.659460 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:46:59 crc kubenswrapper[4546]: E0201 07:46:59.660283 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.274324 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zg7k6"] Feb 01 07:47:02 crc kubenswrapper[4546]: E0201 07:47:02.275270 4546 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="abe2eed6-dd1b-4865-8ee7-1c675edda8c8" containerName="collect-profiles" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.275284 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe2eed6-dd1b-4865-8ee7-1c675edda8c8" containerName="collect-profiles" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.275482 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe2eed6-dd1b-4865-8ee7-1c675edda8c8" containerName="collect-profiles" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.276901 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.296398 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zg7k6"] Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.409607 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a-utilities\") pod \"community-operators-zg7k6\" (UID: \"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a\") " pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.409898 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a-catalog-content\") pod \"community-operators-zg7k6\" (UID: \"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a\") " pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.410072 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkssp\" (UniqueName: \"kubernetes.io/projected/a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a-kube-api-access-dkssp\") pod \"community-operators-zg7k6\" (UID: 
\"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a\") " pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.512145 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a-utilities\") pod \"community-operators-zg7k6\" (UID: \"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a\") " pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.512307 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a-catalog-content\") pod \"community-operators-zg7k6\" (UID: \"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a\") " pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.512632 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a-utilities\") pod \"community-operators-zg7k6\" (UID: \"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a\") " pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.512681 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a-catalog-content\") pod \"community-operators-zg7k6\" (UID: \"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a\") " pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.512742 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkssp\" (UniqueName: \"kubernetes.io/projected/a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a-kube-api-access-dkssp\") pod \"community-operators-zg7k6\" (UID: \"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a\") 
" pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.539688 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkssp\" (UniqueName: \"kubernetes.io/projected/a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a-kube-api-access-dkssp\") pod \"community-operators-zg7k6\" (UID: \"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a\") " pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:02 crc kubenswrapper[4546]: I0201 07:47:02.595406 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:03 crc kubenswrapper[4546]: I0201 07:47:03.165461 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zg7k6"] Feb 01 07:47:03 crc kubenswrapper[4546]: I0201 07:47:03.317395 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zg7k6" event={"ID":"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a","Type":"ContainerStarted","Data":"81fbd4a105046c87938760ff39eb48173eeef692319eee088cae1b76502d7763"} Feb 01 07:47:04 crc kubenswrapper[4546]: I0201 07:47:04.326531 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a" containerID="687cb9271017bbe3ae8bdb18f9985124ec2cc28b81c1d0cc0e3ba942c116cc1d" exitCode=0 Feb 01 07:47:04 crc kubenswrapper[4546]: I0201 07:47:04.326617 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zg7k6" event={"ID":"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a","Type":"ContainerDied","Data":"687cb9271017bbe3ae8bdb18f9985124ec2cc28b81c1d0cc0e3ba942c116cc1d"} Feb 01 07:47:04 crc kubenswrapper[4546]: I0201 07:47:04.328961 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:47:10 crc kubenswrapper[4546]: I0201 07:47:10.385116 4546 generic.go:334] "Generic (PLEG): container 
finished" podID="a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a" containerID="e91ab3f8689ddb480c9265f3bebd769b101c916b5da262f5a5f592f931432473" exitCode=0 Feb 01 07:47:10 crc kubenswrapper[4546]: I0201 07:47:10.385174 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zg7k6" event={"ID":"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a","Type":"ContainerDied","Data":"e91ab3f8689ddb480c9265f3bebd769b101c916b5da262f5a5f592f931432473"} Feb 01 07:47:10 crc kubenswrapper[4546]: I0201 07:47:10.656811 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:47:10 crc kubenswrapper[4546]: E0201 07:47:10.657312 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:47:11 crc kubenswrapper[4546]: I0201 07:47:11.400378 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zg7k6" event={"ID":"a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a","Type":"ContainerStarted","Data":"031e63e94240ab205ad4e5b4cd56af9475c688df2fe7dbf2e02f3cba41f98102"} Feb 01 07:47:11 crc kubenswrapper[4546]: I0201 07:47:11.426274 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zg7k6" podStartSLOduration=2.734543854 podStartE2EDuration="9.426254125s" podCreationTimestamp="2026-02-01 07:47:02 +0000 UTC" firstStartedPulling="2026-02-01 07:47:04.328682389 +0000 UTC m=+3854.979618394" lastFinishedPulling="2026-02-01 07:47:11.020392649 +0000 UTC m=+3861.671328665" observedRunningTime="2026-02-01 07:47:11.425205157 +0000 UTC 
m=+3862.076141172" watchObservedRunningTime="2026-02-01 07:47:11.426254125 +0000 UTC m=+3862.077190141" Feb 01 07:47:12 crc kubenswrapper[4546]: I0201 07:47:12.596875 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:12 crc kubenswrapper[4546]: I0201 07:47:12.596923 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:13 crc kubenswrapper[4546]: I0201 07:47:13.637569 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zg7k6" podUID="a4dcaeed-5c90-4178-b9ee-d70f3ff5a98a" containerName="registry-server" probeResult="failure" output=< Feb 01 07:47:13 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:47:13 crc kubenswrapper[4546]: > Feb 01 07:47:21 crc kubenswrapper[4546]: I0201 07:47:21.654631 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:47:21 crc kubenswrapper[4546]: E0201 07:47:21.655714 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:47:22 crc kubenswrapper[4546]: I0201 07:47:22.645391 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:22 crc kubenswrapper[4546]: I0201 07:47:22.703215 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zg7k6" Feb 01 07:47:22 crc kubenswrapper[4546]: 
I0201 07:47:22.786789 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zg7k6"] Feb 01 07:47:22 crc kubenswrapper[4546]: I0201 07:47:22.894089 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnzzr"] Feb 01 07:47:22 crc kubenswrapper[4546]: I0201 07:47:22.894978 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lnzzr" podUID="075f574e-3456-4491-884c-6893c8b86ca2" containerName="registry-server" containerID="cri-o://d93dd4495b462494bd1f3a6449b44437eb3dcfb6bf0f5ff3f79d62e8c715597b" gracePeriod=2 Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.535575 4546 generic.go:334] "Generic (PLEG): container finished" podID="075f574e-3456-4491-884c-6893c8b86ca2" containerID="d93dd4495b462494bd1f3a6449b44437eb3dcfb6bf0f5ff3f79d62e8c715597b" exitCode=0 Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.535853 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnzzr" event={"ID":"075f574e-3456-4491-884c-6893c8b86ca2","Type":"ContainerDied","Data":"d93dd4495b462494bd1f3a6449b44437eb3dcfb6bf0f5ff3f79d62e8c715597b"} Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.654927 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnzzr" Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.766749 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9248\" (UniqueName: \"kubernetes.io/projected/075f574e-3456-4491-884c-6893c8b86ca2-kube-api-access-x9248\") pod \"075f574e-3456-4491-884c-6893c8b86ca2\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.766904 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-utilities\") pod \"075f574e-3456-4491-884c-6893c8b86ca2\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.766967 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-catalog-content\") pod \"075f574e-3456-4491-884c-6893c8b86ca2\" (UID: \"075f574e-3456-4491-884c-6893c8b86ca2\") " Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.768386 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-utilities" (OuterVolumeSpecName: "utilities") pod "075f574e-3456-4491-884c-6893c8b86ca2" (UID: "075f574e-3456-4491-884c-6893c8b86ca2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.789398 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/075f574e-3456-4491-884c-6893c8b86ca2-kube-api-access-x9248" (OuterVolumeSpecName: "kube-api-access-x9248") pod "075f574e-3456-4491-884c-6893c8b86ca2" (UID: "075f574e-3456-4491-884c-6893c8b86ca2"). InnerVolumeSpecName "kube-api-access-x9248". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.826048 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "075f574e-3456-4491-884c-6893c8b86ca2" (UID: "075f574e-3456-4491-884c-6893c8b86ca2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.870083 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9248\" (UniqueName: \"kubernetes.io/projected/075f574e-3456-4491-884c-6893c8b86ca2-kube-api-access-x9248\") on node \"crc\" DevicePath \"\"" Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.870117 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:47:23 crc kubenswrapper[4546]: I0201 07:47:23.870126 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/075f574e-3456-4491-884c-6893c8b86ca2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:47:24 crc kubenswrapper[4546]: I0201 07:47:24.545295 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnzzr" event={"ID":"075f574e-3456-4491-884c-6893c8b86ca2","Type":"ContainerDied","Data":"d427021b7b5cba33fec3efa776928931fd31c7aeb81432741c47da5665b45b02"} Feb 01 07:47:24 crc kubenswrapper[4546]: I0201 07:47:24.545550 4546 scope.go:117] "RemoveContainer" containerID="d93dd4495b462494bd1f3a6449b44437eb3dcfb6bf0f5ff3f79d62e8c715597b" Feb 01 07:47:24 crc kubenswrapper[4546]: I0201 07:47:24.545331 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnzzr" Feb 01 07:47:24 crc kubenswrapper[4546]: I0201 07:47:24.570136 4546 scope.go:117] "RemoveContainer" containerID="cd8076f7a1375be45031f91c0d201b0a781c3de49488412b77454442083a232d" Feb 01 07:47:24 crc kubenswrapper[4546]: I0201 07:47:24.589690 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnzzr"] Feb 01 07:47:24 crc kubenswrapper[4546]: I0201 07:47:24.594919 4546 scope.go:117] "RemoveContainer" containerID="e1c7014f21512c975188aa57f7af8972ae575c351235cbc6a29f200c2f116714" Feb 01 07:47:24 crc kubenswrapper[4546]: I0201 07:47:24.599608 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lnzzr"] Feb 01 07:47:25 crc kubenswrapper[4546]: I0201 07:47:25.667836 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="075f574e-3456-4491-884c-6893c8b86ca2" path="/var/lib/kubelet/pods/075f574e-3456-4491-884c-6893c8b86ca2/volumes" Feb 01 07:47:32 crc kubenswrapper[4546]: I0201 07:47:32.654836 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:47:33 crc kubenswrapper[4546]: I0201 07:47:33.636564 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"6115e0b3fdcc72b8af0be87e7a86654b4d0ed1db57eac1a01b1c5bf3a9d1f1e0"} Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.378375 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-54d7f"] Feb 01 07:49:06 crc kubenswrapper[4546]: E0201 07:49:06.382581 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075f574e-3456-4491-884c-6893c8b86ca2" containerName="extract-content" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.382617 4546 
state_mem.go:107] "Deleted CPUSet assignment" podUID="075f574e-3456-4491-884c-6893c8b86ca2" containerName="extract-content" Feb 01 07:49:06 crc kubenswrapper[4546]: E0201 07:49:06.382646 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075f574e-3456-4491-884c-6893c8b86ca2" containerName="extract-utilities" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.382654 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="075f574e-3456-4491-884c-6893c8b86ca2" containerName="extract-utilities" Feb 01 07:49:06 crc kubenswrapper[4546]: E0201 07:49:06.382673 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075f574e-3456-4491-884c-6893c8b86ca2" containerName="registry-server" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.382680 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="075f574e-3456-4491-884c-6893c8b86ca2" containerName="registry-server" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.383098 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="075f574e-3456-4491-884c-6893c8b86ca2" containerName="registry-server" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.387452 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.417133 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-catalog-content\") pod \"certified-operators-54d7f\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.417199 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-utilities\") pod \"certified-operators-54d7f\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.417227 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jwsq\" (UniqueName: \"kubernetes.io/projected/2a419773-cf41-421c-996e-ae9cbb1ea4d8-kube-api-access-7jwsq\") pod \"certified-operators-54d7f\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.423185 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54d7f"] Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.520313 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-catalog-content\") pod \"certified-operators-54d7f\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.520367 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-utilities\") pod \"certified-operators-54d7f\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.520394 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jwsq\" (UniqueName: \"kubernetes.io/projected/2a419773-cf41-421c-996e-ae9cbb1ea4d8-kube-api-access-7jwsq\") pod \"certified-operators-54d7f\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.521060 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-utilities\") pod \"certified-operators-54d7f\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.521089 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-catalog-content\") pod \"certified-operators-54d7f\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.541542 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jwsq\" (UniqueName: \"kubernetes.io/projected/2a419773-cf41-421c-996e-ae9cbb1ea4d8-kube-api-access-7jwsq\") pod \"certified-operators-54d7f\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:06 crc kubenswrapper[4546]: I0201 07:49:06.718364 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:07 crc kubenswrapper[4546]: I0201 07:49:07.264367 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54d7f"] Feb 01 07:49:07 crc kubenswrapper[4546]: I0201 07:49:07.494848 4546 generic.go:334] "Generic (PLEG): container finished" podID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerID="933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f" exitCode=0 Feb 01 07:49:07 crc kubenswrapper[4546]: I0201 07:49:07.495029 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54d7f" event={"ID":"2a419773-cf41-421c-996e-ae9cbb1ea4d8","Type":"ContainerDied","Data":"933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f"} Feb 01 07:49:07 crc kubenswrapper[4546]: I0201 07:49:07.495127 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54d7f" event={"ID":"2a419773-cf41-421c-996e-ae9cbb1ea4d8","Type":"ContainerStarted","Data":"65d3385c2b97ff00ce204503e521f66cf48a85bcb0ae87e6465082f67e1a99d3"} Feb 01 07:49:08 crc kubenswrapper[4546]: I0201 07:49:08.510209 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54d7f" event={"ID":"2a419773-cf41-421c-996e-ae9cbb1ea4d8","Type":"ContainerStarted","Data":"0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd"} Feb 01 07:49:10 crc kubenswrapper[4546]: I0201 07:49:10.530422 4546 generic.go:334] "Generic (PLEG): container finished" podID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerID="0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd" exitCode=0 Feb 01 07:49:10 crc kubenswrapper[4546]: I0201 07:49:10.530498 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54d7f" 
event={"ID":"2a419773-cf41-421c-996e-ae9cbb1ea4d8","Type":"ContainerDied","Data":"0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd"} Feb 01 07:49:11 crc kubenswrapper[4546]: I0201 07:49:11.542360 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54d7f" event={"ID":"2a419773-cf41-421c-996e-ae9cbb1ea4d8","Type":"ContainerStarted","Data":"0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69"} Feb 01 07:49:11 crc kubenswrapper[4546]: I0201 07:49:11.568392 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-54d7f" podStartSLOduration=1.9019751839999999 podStartE2EDuration="5.568368494s" podCreationTimestamp="2026-02-01 07:49:06 +0000 UTC" firstStartedPulling="2026-02-01 07:49:07.496740278 +0000 UTC m=+3978.147676293" lastFinishedPulling="2026-02-01 07:49:11.163133597 +0000 UTC m=+3981.814069603" observedRunningTime="2026-02-01 07:49:11.56530695 +0000 UTC m=+3982.216242967" watchObservedRunningTime="2026-02-01 07:49:11.568368494 +0000 UTC m=+3982.219304509" Feb 01 07:49:16 crc kubenswrapper[4546]: I0201 07:49:16.719407 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:16 crc kubenswrapper[4546]: I0201 07:49:16.720166 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:16 crc kubenswrapper[4546]: I0201 07:49:16.760594 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:17 crc kubenswrapper[4546]: I0201 07:49:17.635591 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:19 crc kubenswrapper[4546]: I0201 07:49:19.531258 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-54d7f"] Feb 01 07:49:19 crc kubenswrapper[4546]: I0201 07:49:19.614595 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-54d7f" podUID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerName="registry-server" containerID="cri-o://0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69" gracePeriod=2 Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.557138 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.626559 4546 generic.go:334] "Generic (PLEG): container finished" podID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerID="0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69" exitCode=0 Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.626600 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54d7f" event={"ID":"2a419773-cf41-421c-996e-ae9cbb1ea4d8","Type":"ContainerDied","Data":"0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69"} Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.626634 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54d7f" event={"ID":"2a419773-cf41-421c-996e-ae9cbb1ea4d8","Type":"ContainerDied","Data":"65d3385c2b97ff00ce204503e521f66cf48a85bcb0ae87e6465082f67e1a99d3"} Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.626631 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-54d7f" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.627241 4546 scope.go:117] "RemoveContainer" containerID="0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.652468 4546 scope.go:117] "RemoveContainer" containerID="0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.672707 4546 scope.go:117] "RemoveContainer" containerID="933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.718246 4546 scope.go:117] "RemoveContainer" containerID="0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69" Feb 01 07:49:20 crc kubenswrapper[4546]: E0201 07:49:20.720126 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69\": container with ID starting with 0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69 not found: ID does not exist" containerID="0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.720757 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69"} err="failed to get container status \"0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69\": rpc error: code = NotFound desc = could not find container \"0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69\": container with ID starting with 0fa0ec6ee4bb778a2997b93faf464f82aed78f8291658748345406b3b1164e69 not found: ID does not exist" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.720829 4546 scope.go:117] "RemoveContainer" 
containerID="0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd" Feb 01 07:49:20 crc kubenswrapper[4546]: E0201 07:49:20.721277 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd\": container with ID starting with 0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd not found: ID does not exist" containerID="0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.721318 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd"} err="failed to get container status \"0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd\": rpc error: code = NotFound desc = could not find container \"0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd\": container with ID starting with 0e355b4c417df326ba2b57e8cb5fac4f2bbc59a56b39425466f2f128660746cd not found: ID does not exist" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.721346 4546 scope.go:117] "RemoveContainer" containerID="933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f" Feb 01 07:49:20 crc kubenswrapper[4546]: E0201 07:49:20.721939 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f\": container with ID starting with 933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f not found: ID does not exist" containerID="933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.721989 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f"} err="failed to get container status \"933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f\": rpc error: code = NotFound desc = could not find container \"933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f\": container with ID starting with 933ba5b73406123480974ef05a217ef13028a19a51681a3eef8c59b9a033690f not found: ID does not exist" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.725188 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-catalog-content\") pod \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.725455 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jwsq\" (UniqueName: \"kubernetes.io/projected/2a419773-cf41-421c-996e-ae9cbb1ea4d8-kube-api-access-7jwsq\") pod \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.725687 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-utilities\") pod \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\" (UID: \"2a419773-cf41-421c-996e-ae9cbb1ea4d8\") " Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.726261 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-utilities" (OuterVolumeSpecName: "utilities") pod "2a419773-cf41-421c-996e-ae9cbb1ea4d8" (UID: "2a419773-cf41-421c-996e-ae9cbb1ea4d8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.734431 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a419773-cf41-421c-996e-ae9cbb1ea4d8-kube-api-access-7jwsq" (OuterVolumeSpecName: "kube-api-access-7jwsq") pod "2a419773-cf41-421c-996e-ae9cbb1ea4d8" (UID: "2a419773-cf41-421c-996e-ae9cbb1ea4d8"). InnerVolumeSpecName "kube-api-access-7jwsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.756788 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a419773-cf41-421c-996e-ae9cbb1ea4d8" (UID: "2a419773-cf41-421c-996e-ae9cbb1ea4d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.829048 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.829081 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jwsq\" (UniqueName: \"kubernetes.io/projected/2a419773-cf41-421c-996e-ae9cbb1ea4d8-kube-api-access-7jwsq\") on node \"crc\" DevicePath \"\"" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.829096 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a419773-cf41-421c-996e-ae9cbb1ea4d8-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 07:49:20.966501 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54d7f"] Feb 01 07:49:20 crc kubenswrapper[4546]: I0201 
07:49:20.975681 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-54d7f"] Feb 01 07:49:21 crc kubenswrapper[4546]: I0201 07:49:21.668774 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" path="/var/lib/kubelet/pods/2a419773-cf41-421c-996e-ae9cbb1ea4d8/volumes" Feb 01 07:49:55 crc kubenswrapper[4546]: I0201 07:49:55.420988 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:49:55 crc kubenswrapper[4546]: I0201 07:49:55.422468 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:50:25 crc kubenswrapper[4546]: I0201 07:50:25.420776 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:50:25 crc kubenswrapper[4546]: I0201 07:50:25.421299 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:50:55 crc kubenswrapper[4546]: I0201 07:50:55.420306 4546 patch_prober.go:28] interesting 
pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:50:55 crc kubenswrapper[4546]: I0201 07:50:55.420974 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:50:55 crc kubenswrapper[4546]: I0201 07:50:55.421022 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:50:55 crc kubenswrapper[4546]: I0201 07:50:55.421806 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6115e0b3fdcc72b8af0be87e7a86654b4d0ed1db57eac1a01b1c5bf3a9d1f1e0"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:50:55 crc kubenswrapper[4546]: I0201 07:50:55.421889 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://6115e0b3fdcc72b8af0be87e7a86654b4d0ed1db57eac1a01b1c5bf3a9d1f1e0" gracePeriod=600 Feb 01 07:50:56 crc kubenswrapper[4546]: I0201 07:50:56.447503 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="6115e0b3fdcc72b8af0be87e7a86654b4d0ed1db57eac1a01b1c5bf3a9d1f1e0" exitCode=0 Feb 01 07:50:56 crc kubenswrapper[4546]: I0201 07:50:56.447585 
4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"6115e0b3fdcc72b8af0be87e7a86654b4d0ed1db57eac1a01b1c5bf3a9d1f1e0"} Feb 01 07:50:56 crc kubenswrapper[4546]: I0201 07:50:56.448107 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2"} Feb 01 07:50:56 crc kubenswrapper[4546]: I0201 07:50:56.448134 4546 scope.go:117] "RemoveContainer" containerID="28c24e7c33e1a6f4a5b672c5b45abb797ca1b2ac4b3d2607faccd5f92d65376a" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.093151 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ng5dz"] Feb 01 07:52:03 crc kubenswrapper[4546]: E0201 07:52:03.094232 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerName="registry-server" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.094250 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerName="registry-server" Feb 01 07:52:03 crc kubenswrapper[4546]: E0201 07:52:03.094261 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerName="extract-utilities" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.094267 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerName="extract-utilities" Feb 01 07:52:03 crc kubenswrapper[4546]: E0201 07:52:03.094291 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerName="extract-content" Feb 01 07:52:03 crc kubenswrapper[4546]: 
I0201 07:52:03.094297 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerName="extract-content" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.094529 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a419773-cf41-421c-996e-ae9cbb1ea4d8" containerName="registry-server" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.096698 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.106873 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng5dz"] Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.143518 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-utilities\") pod \"redhat-operators-ng5dz\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.143701 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4z8f\" (UniqueName: \"kubernetes.io/projected/f0156d9c-3699-413e-8903-8370eda3156f-kube-api-access-v4z8f\") pod \"redhat-operators-ng5dz\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.143746 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-catalog-content\") pod \"redhat-operators-ng5dz\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 
07:52:03.245520 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4z8f\" (UniqueName: \"kubernetes.io/projected/f0156d9c-3699-413e-8903-8370eda3156f-kube-api-access-v4z8f\") pod \"redhat-operators-ng5dz\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.245580 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-catalog-content\") pod \"redhat-operators-ng5dz\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.245695 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-utilities\") pod \"redhat-operators-ng5dz\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.246103 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-catalog-content\") pod \"redhat-operators-ng5dz\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.246131 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-utilities\") pod \"redhat-operators-ng5dz\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.265607 4546 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-v4z8f\" (UniqueName: \"kubernetes.io/projected/f0156d9c-3699-413e-8903-8370eda3156f-kube-api-access-v4z8f\") pod \"redhat-operators-ng5dz\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:03 crc kubenswrapper[4546]: I0201 07:52:03.418323 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:04 crc kubenswrapper[4546]: I0201 07:52:04.087076 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng5dz"] Feb 01 07:52:05 crc kubenswrapper[4546]: I0201 07:52:05.024701 4546 generic.go:334] "Generic (PLEG): container finished" podID="f0156d9c-3699-413e-8903-8370eda3156f" containerID="eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1" exitCode=0 Feb 01 07:52:05 crc kubenswrapper[4546]: I0201 07:52:05.024763 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng5dz" event={"ID":"f0156d9c-3699-413e-8903-8370eda3156f","Type":"ContainerDied","Data":"eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1"} Feb 01 07:52:05 crc kubenswrapper[4546]: I0201 07:52:05.025076 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng5dz" event={"ID":"f0156d9c-3699-413e-8903-8370eda3156f","Type":"ContainerStarted","Data":"702dc17cb18a21165abe9746a1f17844f2e92224337167c86bf8897ffe5f9565"} Feb 01 07:52:05 crc kubenswrapper[4546]: I0201 07:52:05.028767 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:52:06 crc kubenswrapper[4546]: I0201 07:52:06.035487 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng5dz" 
event={"ID":"f0156d9c-3699-413e-8903-8370eda3156f","Type":"ContainerStarted","Data":"bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c"} Feb 01 07:52:09 crc kubenswrapper[4546]: I0201 07:52:09.059780 4546 generic.go:334] "Generic (PLEG): container finished" podID="f0156d9c-3699-413e-8903-8370eda3156f" containerID="bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c" exitCode=0 Feb 01 07:52:09 crc kubenswrapper[4546]: I0201 07:52:09.059869 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng5dz" event={"ID":"f0156d9c-3699-413e-8903-8370eda3156f","Type":"ContainerDied","Data":"bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c"} Feb 01 07:52:10 crc kubenswrapper[4546]: I0201 07:52:10.071503 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng5dz" event={"ID":"f0156d9c-3699-413e-8903-8370eda3156f","Type":"ContainerStarted","Data":"b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351"} Feb 01 07:52:10 crc kubenswrapper[4546]: I0201 07:52:10.094756 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ng5dz" podStartSLOduration=2.516913912 podStartE2EDuration="7.094733242s" podCreationTimestamp="2026-02-01 07:52:03 +0000 UTC" firstStartedPulling="2026-02-01 07:52:05.02806879 +0000 UTC m=+4155.679004806" lastFinishedPulling="2026-02-01 07:52:09.60588812 +0000 UTC m=+4160.256824136" observedRunningTime="2026-02-01 07:52:10.083896036 +0000 UTC m=+4160.734832052" watchObservedRunningTime="2026-02-01 07:52:10.094733242 +0000 UTC m=+4160.745669257" Feb 01 07:52:13 crc kubenswrapper[4546]: I0201 07:52:13.419797 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:13 crc kubenswrapper[4546]: I0201 07:52:13.420509 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:14 crc kubenswrapper[4546]: I0201 07:52:14.452978 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ng5dz" podUID="f0156d9c-3699-413e-8903-8370eda3156f" containerName="registry-server" probeResult="failure" output=< Feb 01 07:52:14 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 07:52:14 crc kubenswrapper[4546]: > Feb 01 07:52:23 crc kubenswrapper[4546]: I0201 07:52:23.454086 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:23 crc kubenswrapper[4546]: I0201 07:52:23.494408 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:23 crc kubenswrapper[4546]: I0201 07:52:23.688019 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng5dz"] Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.178884 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ng5dz" podUID="f0156d9c-3699-413e-8903-8370eda3156f" containerName="registry-server" containerID="cri-o://b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351" gracePeriod=2 Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.635545 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.796115 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4z8f\" (UniqueName: \"kubernetes.io/projected/f0156d9c-3699-413e-8903-8370eda3156f-kube-api-access-v4z8f\") pod \"f0156d9c-3699-413e-8903-8370eda3156f\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.796324 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-utilities\") pod \"f0156d9c-3699-413e-8903-8370eda3156f\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.796489 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-catalog-content\") pod \"f0156d9c-3699-413e-8903-8370eda3156f\" (UID: \"f0156d9c-3699-413e-8903-8370eda3156f\") " Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.797108 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-utilities" (OuterVolumeSpecName: "utilities") pod "f0156d9c-3699-413e-8903-8370eda3156f" (UID: "f0156d9c-3699-413e-8903-8370eda3156f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.799112 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.806718 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0156d9c-3699-413e-8903-8370eda3156f-kube-api-access-v4z8f" (OuterVolumeSpecName: "kube-api-access-v4z8f") pod "f0156d9c-3699-413e-8903-8370eda3156f" (UID: "f0156d9c-3699-413e-8903-8370eda3156f"). InnerVolumeSpecName "kube-api-access-v4z8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.902061 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4z8f\" (UniqueName: \"kubernetes.io/projected/f0156d9c-3699-413e-8903-8370eda3156f-kube-api-access-v4z8f\") on node \"crc\" DevicePath \"\"" Feb 01 07:52:25 crc kubenswrapper[4546]: I0201 07:52:25.908784 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0156d9c-3699-413e-8903-8370eda3156f" (UID: "f0156d9c-3699-413e-8903-8370eda3156f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.004452 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0156d9c-3699-413e-8903-8370eda3156f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.189775 4546 generic.go:334] "Generic (PLEG): container finished" podID="f0156d9c-3699-413e-8903-8370eda3156f" containerID="b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351" exitCode=0 Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.189816 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng5dz" event={"ID":"f0156d9c-3699-413e-8903-8370eda3156f","Type":"ContainerDied","Data":"b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351"} Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.189843 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng5dz" event={"ID":"f0156d9c-3699-413e-8903-8370eda3156f","Type":"ContainerDied","Data":"702dc17cb18a21165abe9746a1f17844f2e92224337167c86bf8897ffe5f9565"} Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.189882 4546 scope.go:117] "RemoveContainer" containerID="b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.190022 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ng5dz" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.220301 4546 scope.go:117] "RemoveContainer" containerID="bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.223022 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng5dz"] Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.231584 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ng5dz"] Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.253322 4546 scope.go:117] "RemoveContainer" containerID="eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.272806 4546 scope.go:117] "RemoveContainer" containerID="b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351" Feb 01 07:52:26 crc kubenswrapper[4546]: E0201 07:52:26.273506 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351\": container with ID starting with b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351 not found: ID does not exist" containerID="b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.273549 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351"} err="failed to get container status \"b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351\": rpc error: code = NotFound desc = could not find container \"b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351\": container with ID starting with b800b7275142e7667e2560c2f762c8bec08e16e4fd7cd14c45e401b4e931e351 not found: ID does 
not exist" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.273594 4546 scope.go:117] "RemoveContainer" containerID="bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c" Feb 01 07:52:26 crc kubenswrapper[4546]: E0201 07:52:26.274035 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c\": container with ID starting with bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c not found: ID does not exist" containerID="bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.274081 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c"} err="failed to get container status \"bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c\": rpc error: code = NotFound desc = could not find container \"bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c\": container with ID starting with bec9ddf34653fad6904a5c124d26eeeff207108005f018db7984772eab80784c not found: ID does not exist" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.274108 4546 scope.go:117] "RemoveContainer" containerID="eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1" Feb 01 07:52:26 crc kubenswrapper[4546]: E0201 07:52:26.274367 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1\": container with ID starting with eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1 not found: ID does not exist" containerID="eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1" Feb 01 07:52:26 crc kubenswrapper[4546]: I0201 07:52:26.274394 4546 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1"} err="failed to get container status \"eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1\": rpc error: code = NotFound desc = could not find container \"eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1\": container with ID starting with eb092e303f851abea97356df185d55d7741ac42ae9296353ce68db2e21b9d6b1 not found: ID does not exist" Feb 01 07:52:27 crc kubenswrapper[4546]: I0201 07:52:27.663071 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0156d9c-3699-413e-8903-8370eda3156f" path="/var/lib/kubelet/pods/f0156d9c-3699-413e-8903-8370eda3156f/volumes" Feb 01 07:52:55 crc kubenswrapper[4546]: I0201 07:52:55.420884 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:52:55 crc kubenswrapper[4546]: I0201 07:52:55.421263 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:53:25 crc kubenswrapper[4546]: I0201 07:53:25.420926 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:53:25 crc kubenswrapper[4546]: I0201 07:53:25.421744 4546 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.891968 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9hxb9"] Feb 01 07:53:39 crc kubenswrapper[4546]: E0201 07:53:39.893521 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0156d9c-3699-413e-8903-8370eda3156f" containerName="extract-content" Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.893589 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0156d9c-3699-413e-8903-8370eda3156f" containerName="extract-content" Feb 01 07:53:39 crc kubenswrapper[4546]: E0201 07:53:39.894139 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0156d9c-3699-413e-8903-8370eda3156f" containerName="registry-server" Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.894153 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0156d9c-3699-413e-8903-8370eda3156f" containerName="registry-server" Feb 01 07:53:39 crc kubenswrapper[4546]: E0201 07:53:39.894192 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0156d9c-3699-413e-8903-8370eda3156f" containerName="extract-utilities" Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.894202 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0156d9c-3699-413e-8903-8370eda3156f" containerName="extract-utilities" Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.894721 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0156d9c-3699-413e-8903-8370eda3156f" containerName="registry-server" Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.898270 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.905992 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hxb9"] Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.944738 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-utilities\") pod \"redhat-marketplace-9hxb9\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.945151 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wd7r\" (UniqueName: \"kubernetes.io/projected/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-kube-api-access-2wd7r\") pod \"redhat-marketplace-9hxb9\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:39 crc kubenswrapper[4546]: I0201 07:53:39.945353 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-catalog-content\") pod \"redhat-marketplace-9hxb9\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:40 crc kubenswrapper[4546]: I0201 07:53:40.047666 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-utilities\") pod \"redhat-marketplace-9hxb9\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:40 crc kubenswrapper[4546]: I0201 07:53:40.047993 4546 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2wd7r\" (UniqueName: \"kubernetes.io/projected/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-kube-api-access-2wd7r\") pod \"redhat-marketplace-9hxb9\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:40 crc kubenswrapper[4546]: I0201 07:53:40.048053 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-catalog-content\") pod \"redhat-marketplace-9hxb9\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:40 crc kubenswrapper[4546]: I0201 07:53:40.048149 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-utilities\") pod \"redhat-marketplace-9hxb9\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:40 crc kubenswrapper[4546]: I0201 07:53:40.048459 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-catalog-content\") pod \"redhat-marketplace-9hxb9\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:40 crc kubenswrapper[4546]: I0201 07:53:40.065397 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wd7r\" (UniqueName: \"kubernetes.io/projected/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-kube-api-access-2wd7r\") pod \"redhat-marketplace-9hxb9\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:40 crc kubenswrapper[4546]: I0201 07:53:40.228914 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:40 crc kubenswrapper[4546]: I0201 07:53:40.708273 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hxb9"] Feb 01 07:53:40 crc kubenswrapper[4546]: I0201 07:53:40.775328 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hxb9" event={"ID":"fd7f8013-8f46-4e6a-b947-9ddc23eccbae","Type":"ContainerStarted","Data":"58741d18552a2784f83880f126d73ade6986e994fa5017638f0192bcde5aba21"} Feb 01 07:53:41 crc kubenswrapper[4546]: I0201 07:53:41.786228 4546 generic.go:334] "Generic (PLEG): container finished" podID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerID="88fe0768b8d82b68ab157eb8e548bdef59cf2b39f61225877642d8af9d079484" exitCode=0 Feb 01 07:53:41 crc kubenswrapper[4546]: I0201 07:53:41.786325 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hxb9" event={"ID":"fd7f8013-8f46-4e6a-b947-9ddc23eccbae","Type":"ContainerDied","Data":"88fe0768b8d82b68ab157eb8e548bdef59cf2b39f61225877642d8af9d079484"} Feb 01 07:53:42 crc kubenswrapper[4546]: I0201 07:53:42.806354 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hxb9" event={"ID":"fd7f8013-8f46-4e6a-b947-9ddc23eccbae","Type":"ContainerStarted","Data":"0c49680c52f52dcbfdff207b84659bf4ee16f4b626654e836d45c833d9cc6dda"} Feb 01 07:53:43 crc kubenswrapper[4546]: I0201 07:53:43.815807 4546 generic.go:334] "Generic (PLEG): container finished" podID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerID="0c49680c52f52dcbfdff207b84659bf4ee16f4b626654e836d45c833d9cc6dda" exitCode=0 Feb 01 07:53:43 crc kubenswrapper[4546]: I0201 07:53:43.815910 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hxb9" 
event={"ID":"fd7f8013-8f46-4e6a-b947-9ddc23eccbae","Type":"ContainerDied","Data":"0c49680c52f52dcbfdff207b84659bf4ee16f4b626654e836d45c833d9cc6dda"} Feb 01 07:53:44 crc kubenswrapper[4546]: I0201 07:53:44.830984 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hxb9" event={"ID":"fd7f8013-8f46-4e6a-b947-9ddc23eccbae","Type":"ContainerStarted","Data":"32d5f10fa76224d43e1aad8184dc57b6e242d736ba026fe27e679117b64f0903"} Feb 01 07:53:44 crc kubenswrapper[4546]: I0201 07:53:44.848018 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9hxb9" podStartSLOduration=3.34230632 podStartE2EDuration="5.847991464s" podCreationTimestamp="2026-02-01 07:53:39 +0000 UTC" firstStartedPulling="2026-02-01 07:53:41.788726774 +0000 UTC m=+4252.439662790" lastFinishedPulling="2026-02-01 07:53:44.294411918 +0000 UTC m=+4254.945347934" observedRunningTime="2026-02-01 07:53:44.845447608 +0000 UTC m=+4255.496383623" watchObservedRunningTime="2026-02-01 07:53:44.847991464 +0000 UTC m=+4255.498927480" Feb 01 07:53:50 crc kubenswrapper[4546]: I0201 07:53:50.223088 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:50 crc kubenswrapper[4546]: I0201 07:53:50.223649 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:50 crc kubenswrapper[4546]: I0201 07:53:50.267302 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:50 crc kubenswrapper[4546]: I0201 07:53:50.949527 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.481508 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-9hxb9"] Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.482282 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9hxb9" podUID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerName="registry-server" containerID="cri-o://32d5f10fa76224d43e1aad8184dc57b6e242d736ba026fe27e679117b64f0903" gracePeriod=2 Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.922902 4546 generic.go:334] "Generic (PLEG): container finished" podID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerID="32d5f10fa76224d43e1aad8184dc57b6e242d736ba026fe27e679117b64f0903" exitCode=0 Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.923084 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hxb9" event={"ID":"fd7f8013-8f46-4e6a-b947-9ddc23eccbae","Type":"ContainerDied","Data":"32d5f10fa76224d43e1aad8184dc57b6e242d736ba026fe27e679117b64f0903"} Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.923197 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9hxb9" event={"ID":"fd7f8013-8f46-4e6a-b947-9ddc23eccbae","Type":"ContainerDied","Data":"58741d18552a2784f83880f126d73ade6986e994fa5017638f0192bcde5aba21"} Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.923221 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58741d18552a2784f83880f126d73ade6986e994fa5017638f0192bcde5aba21" Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.931039 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.970341 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-catalog-content\") pod \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.970504 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wd7r\" (UniqueName: \"kubernetes.io/projected/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-kube-api-access-2wd7r\") pod \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.970561 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-utilities\") pod \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\" (UID: \"fd7f8013-8f46-4e6a-b947-9ddc23eccbae\") " Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.971428 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-utilities" (OuterVolumeSpecName: "utilities") pod "fd7f8013-8f46-4e6a-b947-9ddc23eccbae" (UID: "fd7f8013-8f46-4e6a-b947-9ddc23eccbae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.985274 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd7f8013-8f46-4e6a-b947-9ddc23eccbae" (UID: "fd7f8013-8f46-4e6a-b947-9ddc23eccbae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:53:53 crc kubenswrapper[4546]: I0201 07:53:53.993427 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-kube-api-access-2wd7r" (OuterVolumeSpecName: "kube-api-access-2wd7r") pod "fd7f8013-8f46-4e6a-b947-9ddc23eccbae" (UID: "fd7f8013-8f46-4e6a-b947-9ddc23eccbae"). InnerVolumeSpecName "kube-api-access-2wd7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:53:54 crc kubenswrapper[4546]: I0201 07:53:54.075598 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:53:54 crc kubenswrapper[4546]: I0201 07:53:54.075646 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:53:54 crc kubenswrapper[4546]: I0201 07:53:54.075663 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wd7r\" (UniqueName: \"kubernetes.io/projected/fd7f8013-8f46-4e6a-b947-9ddc23eccbae-kube-api-access-2wd7r\") on node \"crc\" DevicePath \"\"" Feb 01 07:53:54 crc kubenswrapper[4546]: I0201 07:53:54.932976 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9hxb9" Feb 01 07:53:54 crc kubenswrapper[4546]: I0201 07:53:54.964228 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hxb9"] Feb 01 07:53:54 crc kubenswrapper[4546]: I0201 07:53:54.969839 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9hxb9"] Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.420584 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.420952 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.420996 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.421899 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.421956 4546 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" gracePeriod=600 Feb 01 07:53:55 crc kubenswrapper[4546]: E0201 07:53:55.538555 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.663826 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" path="/var/lib/kubelet/pods/fd7f8013-8f46-4e6a-b947-9ddc23eccbae/volumes" Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.946999 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" exitCode=0 Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.947082 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2"} Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.947152 4546 scope.go:117] "RemoveContainer" containerID="6115e0b3fdcc72b8af0be87e7a86654b4d0ed1db57eac1a01b1c5bf3a9d1f1e0" Feb 01 07:53:55 crc kubenswrapper[4546]: I0201 07:53:55.947690 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:53:55 crc kubenswrapper[4546]: E0201 
07:53:55.948065 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:54:08 crc kubenswrapper[4546]: I0201 07:54:08.655620 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:54:08 crc kubenswrapper[4546]: E0201 07:54:08.656648 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:54:21 crc kubenswrapper[4546]: I0201 07:54:21.655309 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:54:21 crc kubenswrapper[4546]: E0201 07:54:21.656937 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:54:32 crc kubenswrapper[4546]: I0201 07:54:32.655034 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:54:32 crc 
kubenswrapper[4546]: E0201 07:54:32.655774 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:54:47 crc kubenswrapper[4546]: I0201 07:54:47.654804 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:54:47 crc kubenswrapper[4546]: E0201 07:54:47.655908 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:55:00 crc kubenswrapper[4546]: I0201 07:55:00.655929 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:55:00 crc kubenswrapper[4546]: E0201 07:55:00.657145 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:55:12 crc kubenswrapper[4546]: I0201 07:55:12.655303 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 
01 07:55:12 crc kubenswrapper[4546]: E0201 07:55:12.656825 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:55:26 crc kubenswrapper[4546]: I0201 07:55:26.654984 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:55:26 crc kubenswrapper[4546]: E0201 07:55:26.655732 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:55:38 crc kubenswrapper[4546]: I0201 07:55:38.655261 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:55:38 crc kubenswrapper[4546]: E0201 07:55:38.657172 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:55:51 crc kubenswrapper[4546]: I0201 07:55:51.654520 4546 scope.go:117] "RemoveContainer" 
containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:55:51 crc kubenswrapper[4546]: E0201 07:55:51.655469 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:56:04 crc kubenswrapper[4546]: I0201 07:56:04.655679 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:56:04 crc kubenswrapper[4546]: E0201 07:56:04.657020 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:56:18 crc kubenswrapper[4546]: I0201 07:56:18.656106 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:56:18 crc kubenswrapper[4546]: E0201 07:56:18.657098 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:56:32 crc kubenswrapper[4546]: I0201 07:56:32.655647 4546 scope.go:117] 
"RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:56:32 crc kubenswrapper[4546]: E0201 07:56:32.656424 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:56:45 crc kubenswrapper[4546]: I0201 07:56:45.655650 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:56:45 crc kubenswrapper[4546]: E0201 07:56:45.656398 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:56:58 crc kubenswrapper[4546]: I0201 07:56:58.655110 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:56:58 crc kubenswrapper[4546]: E0201 07:56:58.656192 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:57:13 crc kubenswrapper[4546]: I0201 07:57:13.655732 
4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:57:13 crc kubenswrapper[4546]: E0201 07:57:13.656946 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:57:26 crc kubenswrapper[4546]: I0201 07:57:26.655096 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:57:26 crc kubenswrapper[4546]: E0201 07:57:26.656083 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:57:39 crc kubenswrapper[4546]: I0201 07:57:39.662422 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:57:39 crc kubenswrapper[4546]: E0201 07:57:39.664543 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:57:50 crc kubenswrapper[4546]: I0201 
07:57:50.654649 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:57:50 crc kubenswrapper[4546]: E0201 07:57:50.655420 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:58:05 crc kubenswrapper[4546]: I0201 07:58:05.655496 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:58:05 crc kubenswrapper[4546]: E0201 07:58:05.656374 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:58:16 crc kubenswrapper[4546]: I0201 07:58:16.655152 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:58:16 crc kubenswrapper[4546]: E0201 07:58:16.656084 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:58:28 crc 
kubenswrapper[4546]: I0201 07:58:28.655032 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:58:28 crc kubenswrapper[4546]: E0201 07:58:28.655619 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:58:39 crc kubenswrapper[4546]: I0201 07:58:39.660124 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:58:39 crc kubenswrapper[4546]: E0201 07:58:39.661806 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 07:58:54 crc kubenswrapper[4546]: I0201 07:58:54.655244 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:58:54 crc kubenswrapper[4546]: E0201 07:58:54.656035 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 
01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.362185 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hbbwn"] Feb 01 07:59:07 crc kubenswrapper[4546]: E0201 07:59:07.362950 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerName="extract-content" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.362963 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerName="extract-content" Feb 01 07:59:07 crc kubenswrapper[4546]: E0201 07:59:07.362983 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerName="extract-utilities" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.362988 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerName="extract-utilities" Feb 01 07:59:07 crc kubenswrapper[4546]: E0201 07:59:07.363000 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerName="registry-server" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.363005 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerName="registry-server" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.363175 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd7f8013-8f46-4e6a-b947-9ddc23eccbae" containerName="registry-server" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.364314 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.383265 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hbbwn"] Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.437568 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-catalog-content\") pod \"certified-operators-hbbwn\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.437640 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-utilities\") pod \"certified-operators-hbbwn\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.437719 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhkrc\" (UniqueName: \"kubernetes.io/projected/77339113-60e8-48a5-8ada-635e2e85fc68-kube-api-access-nhkrc\") pod \"certified-operators-hbbwn\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.541557 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-utilities\") pod \"certified-operators-hbbwn\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.541908 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nhkrc\" (UniqueName: \"kubernetes.io/projected/77339113-60e8-48a5-8ada-635e2e85fc68-kube-api-access-nhkrc\") pod \"certified-operators-hbbwn\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.542228 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-utilities\") pod \"certified-operators-hbbwn\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.542666 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-catalog-content\") pod \"certified-operators-hbbwn\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.543014 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-catalog-content\") pod \"certified-operators-hbbwn\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.565832 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhkrc\" (UniqueName: \"kubernetes.io/projected/77339113-60e8-48a5-8ada-635e2e85fc68-kube-api-access-nhkrc\") pod \"certified-operators-hbbwn\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.654532 4546 scope.go:117] "RemoveContainer" 
containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2" Feb 01 07:59:07 crc kubenswrapper[4546]: I0201 07:59:07.692581 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:08 crc kubenswrapper[4546]: I0201 07:59:08.197450 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hbbwn"] Feb 01 07:59:08 crc kubenswrapper[4546]: I0201 07:59:08.547662 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"2cbff502824aa7d43ad4068ab9e561311498bbd9e7840870704d4dc72b3dc726"} Feb 01 07:59:08 crc kubenswrapper[4546]: I0201 07:59:08.550475 4546 generic.go:334] "Generic (PLEG): container finished" podID="77339113-60e8-48a5-8ada-635e2e85fc68" containerID="851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8" exitCode=0 Feb 01 07:59:08 crc kubenswrapper[4546]: I0201 07:59:08.550573 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbbwn" event={"ID":"77339113-60e8-48a5-8ada-635e2e85fc68","Type":"ContainerDied","Data":"851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8"} Feb 01 07:59:08 crc kubenswrapper[4546]: I0201 07:59:08.550773 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbbwn" event={"ID":"77339113-60e8-48a5-8ada-635e2e85fc68","Type":"ContainerStarted","Data":"5e5f07e29bdfa72ea70b906f4f8e739ca194cf5f2da798806ea6fea5918c7b19"} Feb 01 07:59:08 crc kubenswrapper[4546]: I0201 07:59:08.552616 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:59:09 crc kubenswrapper[4546]: I0201 07:59:09.562569 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-hbbwn" event={"ID":"77339113-60e8-48a5-8ada-635e2e85fc68","Type":"ContainerStarted","Data":"ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3"} Feb 01 07:59:10 crc kubenswrapper[4546]: I0201 07:59:10.571422 4546 generic.go:334] "Generic (PLEG): container finished" podID="77339113-60e8-48a5-8ada-635e2e85fc68" containerID="ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3" exitCode=0 Feb 01 07:59:10 crc kubenswrapper[4546]: I0201 07:59:10.571530 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbbwn" event={"ID":"77339113-60e8-48a5-8ada-635e2e85fc68","Type":"ContainerDied","Data":"ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3"} Feb 01 07:59:11 crc kubenswrapper[4546]: I0201 07:59:11.583452 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbbwn" event={"ID":"77339113-60e8-48a5-8ada-635e2e85fc68","Type":"ContainerStarted","Data":"257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3"} Feb 01 07:59:11 crc kubenswrapper[4546]: I0201 07:59:11.604556 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hbbwn" podStartSLOduration=2.10451663 podStartE2EDuration="4.60453593s" podCreationTimestamp="2026-02-01 07:59:07 +0000 UTC" firstStartedPulling="2026-02-01 07:59:08.552287319 +0000 UTC m=+4579.203223335" lastFinishedPulling="2026-02-01 07:59:11.05230662 +0000 UTC m=+4581.703242635" observedRunningTime="2026-02-01 07:59:11.603277668 +0000 UTC m=+4582.254213683" watchObservedRunningTime="2026-02-01 07:59:11.60453593 +0000 UTC m=+4582.255471946" Feb 01 07:59:17 crc kubenswrapper[4546]: I0201 07:59:17.692975 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:17 crc kubenswrapper[4546]: I0201 07:59:17.693669 4546 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:17 crc kubenswrapper[4546]: I0201 07:59:17.759580 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:18 crc kubenswrapper[4546]: I0201 07:59:18.690050 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:18 crc kubenswrapper[4546]: I0201 07:59:18.732160 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hbbwn"] Feb 01 07:59:20 crc kubenswrapper[4546]: I0201 07:59:20.665619 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hbbwn" podUID="77339113-60e8-48a5-8ada-635e2e85fc68" containerName="registry-server" containerID="cri-o://257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3" gracePeriod=2 Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.255759 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.355766 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-utilities\") pod \"77339113-60e8-48a5-8ada-635e2e85fc68\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.355904 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhkrc\" (UniqueName: \"kubernetes.io/projected/77339113-60e8-48a5-8ada-635e2e85fc68-kube-api-access-nhkrc\") pod \"77339113-60e8-48a5-8ada-635e2e85fc68\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.356008 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-catalog-content\") pod \"77339113-60e8-48a5-8ada-635e2e85fc68\" (UID: \"77339113-60e8-48a5-8ada-635e2e85fc68\") " Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.357241 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-utilities" (OuterVolumeSpecName: "utilities") pod "77339113-60e8-48a5-8ada-635e2e85fc68" (UID: "77339113-60e8-48a5-8ada-635e2e85fc68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.364111 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77339113-60e8-48a5-8ada-635e2e85fc68-kube-api-access-nhkrc" (OuterVolumeSpecName: "kube-api-access-nhkrc") pod "77339113-60e8-48a5-8ada-635e2e85fc68" (UID: "77339113-60e8-48a5-8ada-635e2e85fc68"). InnerVolumeSpecName "kube-api-access-nhkrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.394627 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77339113-60e8-48a5-8ada-635e2e85fc68" (UID: "77339113-60e8-48a5-8ada-635e2e85fc68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.458283 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.458314 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhkrc\" (UniqueName: \"kubernetes.io/projected/77339113-60e8-48a5-8ada-635e2e85fc68-kube-api-access-nhkrc\") on node \"crc\" DevicePath \"\"" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.458324 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77339113-60e8-48a5-8ada-635e2e85fc68-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.674187 4546 generic.go:334] "Generic (PLEG): container finished" podID="77339113-60e8-48a5-8ada-635e2e85fc68" containerID="257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3" exitCode=0 Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.674556 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbbwn" event={"ID":"77339113-60e8-48a5-8ada-635e2e85fc68","Type":"ContainerDied","Data":"257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3"} Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.674584 4546 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-hbbwn" event={"ID":"77339113-60e8-48a5-8ada-635e2e85fc68","Type":"ContainerDied","Data":"5e5f07e29bdfa72ea70b906f4f8e739ca194cf5f2da798806ea6fea5918c7b19"} Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.674604 4546 scope.go:117] "RemoveContainer" containerID="257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.674734 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hbbwn" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.703448 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hbbwn"] Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.706368 4546 scope.go:117] "RemoveContainer" containerID="ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.709156 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hbbwn"] Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.723063 4546 scope.go:117] "RemoveContainer" containerID="851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.754473 4546 scope.go:117] "RemoveContainer" containerID="257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3" Feb 01 07:59:21 crc kubenswrapper[4546]: E0201 07:59:21.754983 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3\": container with ID starting with 257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3 not found: ID does not exist" containerID="257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 
07:59:21.755096 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3"} err="failed to get container status \"257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3\": rpc error: code = NotFound desc = could not find container \"257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3\": container with ID starting with 257c55de880c7a2a358f8a8593af8019858f85e22c75dd28a437d2643fc07ba3 not found: ID does not exist" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.755124 4546 scope.go:117] "RemoveContainer" containerID="ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3" Feb 01 07:59:21 crc kubenswrapper[4546]: E0201 07:59:21.755487 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3\": container with ID starting with ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3 not found: ID does not exist" containerID="ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.755534 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3"} err="failed to get container status \"ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3\": rpc error: code = NotFound desc = could not find container \"ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3\": container with ID starting with ba3f5722642f74fc420b293f70ecab0819668f0d84bbff54f8a6be9876bae9a3 not found: ID does not exist" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.755556 4546 scope.go:117] "RemoveContainer" containerID="851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8" Feb 01 07:59:21 crc 
kubenswrapper[4546]: E0201 07:59:21.755907 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8\": container with ID starting with 851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8 not found: ID does not exist" containerID="851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8" Feb 01 07:59:21 crc kubenswrapper[4546]: I0201 07:59:21.755931 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8"} err="failed to get container status \"851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8\": rpc error: code = NotFound desc = could not find container \"851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8\": container with ID starting with 851e9be2cd16d41c1d1f62c435104d0b69e5d68f26baeb1ec192ed022c60bbe8 not found: ID does not exist" Feb 01 07:59:23 crc kubenswrapper[4546]: I0201 07:59:23.664633 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77339113-60e8-48a5-8ada-635e2e85fc68" path="/var/lib/kubelet/pods/77339113-60e8-48a5-8ada-635e2e85fc68/volumes" Feb 01 07:59:58 crc kubenswrapper[4546]: I0201 07:59:58.557142 4546 scope.go:117] "RemoveContainer" containerID="0c49680c52f52dcbfdff207b84659bf4ee16f4b626654e836d45c833d9cc6dda" Feb 01 07:59:58 crc kubenswrapper[4546]: I0201 07:59:58.596911 4546 scope.go:117] "RemoveContainer" containerID="32d5f10fa76224d43e1aad8184dc57b6e242d736ba026fe27e679117b64f0903" Feb 01 07:59:58 crc kubenswrapper[4546]: I0201 07:59:58.625490 4546 scope.go:117] "RemoveContainer" containerID="88fe0768b8d82b68ab157eb8e548bdef59cf2b39f61225877642d8af9d079484" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.179096 4546 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx"] Feb 01 08:00:00 crc kubenswrapper[4546]: E0201 08:00:00.180996 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77339113-60e8-48a5-8ada-635e2e85fc68" containerName="extract-content" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.181075 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="77339113-60e8-48a5-8ada-635e2e85fc68" containerName="extract-content" Feb 01 08:00:00 crc kubenswrapper[4546]: E0201 08:00:00.181147 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77339113-60e8-48a5-8ada-635e2e85fc68" containerName="registry-server" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.181194 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="77339113-60e8-48a5-8ada-635e2e85fc68" containerName="registry-server" Feb 01 08:00:00 crc kubenswrapper[4546]: E0201 08:00:00.181268 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77339113-60e8-48a5-8ada-635e2e85fc68" containerName="extract-utilities" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.181315 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="77339113-60e8-48a5-8ada-635e2e85fc68" containerName="extract-utilities" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.181516 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="77339113-60e8-48a5-8ada-635e2e85fc68" containerName="registry-server" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.182181 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.193495 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx"] Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.193674 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.197452 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.219310 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38f4ab5a-49be-4c8f-9803-620b76bea9e0-secret-volume\") pod \"collect-profiles-29498880-d8jrx\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.219367 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwb87\" (UniqueName: \"kubernetes.io/projected/38f4ab5a-49be-4c8f-9803-620b76bea9e0-kube-api-access-wwb87\") pod \"collect-profiles-29498880-d8jrx\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.219405 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f4ab5a-49be-4c8f-9803-620b76bea9e0-config-volume\") pod \"collect-profiles-29498880-d8jrx\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.321000 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38f4ab5a-49be-4c8f-9803-620b76bea9e0-secret-volume\") pod \"collect-profiles-29498880-d8jrx\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.321092 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwb87\" (UniqueName: \"kubernetes.io/projected/38f4ab5a-49be-4c8f-9803-620b76bea9e0-kube-api-access-wwb87\") pod \"collect-profiles-29498880-d8jrx\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.321155 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f4ab5a-49be-4c8f-9803-620b76bea9e0-config-volume\") pod \"collect-profiles-29498880-d8jrx\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.322178 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f4ab5a-49be-4c8f-9803-620b76bea9e0-config-volume\") pod \"collect-profiles-29498880-d8jrx\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.329488 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/38f4ab5a-49be-4c8f-9803-620b76bea9e0-secret-volume\") pod \"collect-profiles-29498880-d8jrx\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.340850 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwb87\" (UniqueName: \"kubernetes.io/projected/38f4ab5a-49be-4c8f-9803-620b76bea9e0-kube-api-access-wwb87\") pod \"collect-profiles-29498880-d8jrx\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.500904 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:00 crc kubenswrapper[4546]: I0201 08:00:00.923149 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx"] Feb 01 08:00:01 crc kubenswrapper[4546]: I0201 08:00:01.036383 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" event={"ID":"38f4ab5a-49be-4c8f-9803-620b76bea9e0","Type":"ContainerStarted","Data":"419da1325e8ffa6de3eeff48fc9528a02a0b3d678cdae9a4ceb7e3d10d49da2a"} Feb 01 08:00:02 crc kubenswrapper[4546]: I0201 08:00:02.044925 4546 generic.go:334] "Generic (PLEG): container finished" podID="38f4ab5a-49be-4c8f-9803-620b76bea9e0" containerID="e9f3083d4991f9c8b6c03036c9f71b017f1f5706dd2e38eaacbaa007a4dc187f" exitCode=0 Feb 01 08:00:02 crc kubenswrapper[4546]: I0201 08:00:02.045220 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" 
event={"ID":"38f4ab5a-49be-4c8f-9803-620b76bea9e0","Type":"ContainerDied","Data":"e9f3083d4991f9c8b6c03036c9f71b017f1f5706dd2e38eaacbaa007a4dc187f"} Feb 01 08:00:03 crc kubenswrapper[4546]: I0201 08:00:03.804480 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:03 crc kubenswrapper[4546]: I0201 08:00:03.989685 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwb87\" (UniqueName: \"kubernetes.io/projected/38f4ab5a-49be-4c8f-9803-620b76bea9e0-kube-api-access-wwb87\") pod \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " Feb 01 08:00:03 crc kubenswrapper[4546]: I0201 08:00:03.989810 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38f4ab5a-49be-4c8f-9803-620b76bea9e0-secret-volume\") pod \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " Feb 01 08:00:03 crc kubenswrapper[4546]: I0201 08:00:03.989909 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f4ab5a-49be-4c8f-9803-620b76bea9e0-config-volume\") pod \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\" (UID: \"38f4ab5a-49be-4c8f-9803-620b76bea9e0\") " Feb 01 08:00:03 crc kubenswrapper[4546]: I0201 08:00:03.990427 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f4ab5a-49be-4c8f-9803-620b76bea9e0-config-volume" (OuterVolumeSpecName: "config-volume") pod "38f4ab5a-49be-4c8f-9803-620b76bea9e0" (UID: "38f4ab5a-49be-4c8f-9803-620b76bea9e0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:00:03 crc kubenswrapper[4546]: I0201 08:00:03.990876 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f4ab5a-49be-4c8f-9803-620b76bea9e0-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:00:03 crc kubenswrapper[4546]: I0201 08:00:03.998532 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f4ab5a-49be-4c8f-9803-620b76bea9e0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "38f4ab5a-49be-4c8f-9803-620b76bea9e0" (UID: "38f4ab5a-49be-4c8f-9803-620b76bea9e0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:00:04 crc kubenswrapper[4546]: I0201 08:00:04.007014 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f4ab5a-49be-4c8f-9803-620b76bea9e0-kube-api-access-wwb87" (OuterVolumeSpecName: "kube-api-access-wwb87") pod "38f4ab5a-49be-4c8f-9803-620b76bea9e0" (UID: "38f4ab5a-49be-4c8f-9803-620b76bea9e0"). InnerVolumeSpecName "kube-api-access-wwb87". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:00:04 crc kubenswrapper[4546]: I0201 08:00:04.063496 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" event={"ID":"38f4ab5a-49be-4c8f-9803-620b76bea9e0","Type":"ContainerDied","Data":"419da1325e8ffa6de3eeff48fc9528a02a0b3d678cdae9a4ceb7e3d10d49da2a"} Feb 01 08:00:04 crc kubenswrapper[4546]: I0201 08:00:04.063537 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419da1325e8ffa6de3eeff48fc9528a02a0b3d678cdae9a4ceb7e3d10d49da2a" Feb 01 08:00:04 crc kubenswrapper[4546]: I0201 08:00:04.063600 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx" Feb 01 08:00:04 crc kubenswrapper[4546]: I0201 08:00:04.095433 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/38f4ab5a-49be-4c8f-9803-620b76bea9e0-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:00:04 crc kubenswrapper[4546]: I0201 08:00:04.095498 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwb87\" (UniqueName: \"kubernetes.io/projected/38f4ab5a-49be-4c8f-9803-620b76bea9e0-kube-api-access-wwb87\") on node \"crc\" DevicePath \"\"" Feb 01 08:00:04 crc kubenswrapper[4546]: I0201 08:00:04.887701 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk"] Feb 01 08:00:04 crc kubenswrapper[4546]: I0201 08:00:04.893977 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-56thk"] Feb 01 08:00:05 crc kubenswrapper[4546]: I0201 08:00:05.666396 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f86b3853-383a-4ffa-8256-b7c5ec09e580" path="/var/lib/kubelet/pods/f86b3853-383a-4ffa-8256-b7c5ec09e580/volumes" Feb 01 08:00:58 crc kubenswrapper[4546]: I0201 08:00:58.690962 4546 scope.go:117] "RemoveContainer" containerID="30b71ac108a05944748368aa998a9b332d5d58bfdcb0c8f6536af2877f9477b5" Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.150941 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29498881-n8tws"] Feb 01 08:01:00 crc kubenswrapper[4546]: E0201 08:01:00.151668 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f4ab5a-49be-4c8f-9803-620b76bea9e0" containerName="collect-profiles" Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.151683 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f4ab5a-49be-4c8f-9803-620b76bea9e0" 
containerName="collect-profiles"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.151879 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f4ab5a-49be-4c8f-9803-620b76bea9e0" containerName="collect-profiles"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.152476 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.160525 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29498881-n8tws"]
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.310079 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-fernet-keys\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.310144 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-config-data\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.310211 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkvtv\" (UniqueName: \"kubernetes.io/projected/2177774b-dffa-45f7-86f9-f4136b098b38-kube-api-access-mkvtv\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.310302 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-combined-ca-bundle\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.412661 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-fernet-keys\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.412740 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-config-data\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.412803 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkvtv\" (UniqueName: \"kubernetes.io/projected/2177774b-dffa-45f7-86f9-f4136b098b38-kube-api-access-mkvtv\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.412911 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-combined-ca-bundle\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.417913 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-config-data\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.417925 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-combined-ca-bundle\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.418978 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-fernet-keys\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.430157 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkvtv\" (UniqueName: \"kubernetes.io/projected/2177774b-dffa-45f7-86f9-f4136b098b38-kube-api-access-mkvtv\") pod \"keystone-cron-29498881-n8tws\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") " pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.469557 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:00 crc kubenswrapper[4546]: I0201 08:01:00.880378 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29498881-n8tws"]
Feb 01 08:01:01 crc kubenswrapper[4546]: I0201 08:01:01.543249 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498881-n8tws" event={"ID":"2177774b-dffa-45f7-86f9-f4136b098b38","Type":"ContainerStarted","Data":"79f6767078e3bd6f39b9b2e7f0d07b23e1b8f2634fcc907eb968f58c94952874"}
Feb 01 08:01:01 crc kubenswrapper[4546]: I0201 08:01:01.543508 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498881-n8tws" event={"ID":"2177774b-dffa-45f7-86f9-f4136b098b38","Type":"ContainerStarted","Data":"04ed631b4d1e1d73f4090e98badcb7b478fef5739270a10883fe6d82f18fbab3"}
Feb 01 08:01:01 crc kubenswrapper[4546]: I0201 08:01:01.560891 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29498881-n8tws" podStartSLOduration=1.560871724 podStartE2EDuration="1.560871724s" podCreationTimestamp="2026-02-01 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:01:01.558199336 +0000 UTC m=+4692.209135352" watchObservedRunningTime="2026-02-01 08:01:01.560871724 +0000 UTC m=+4692.211807740"
Feb 01 08:01:03 crc kubenswrapper[4546]: I0201 08:01:03.557383 4546 generic.go:334] "Generic (PLEG): container finished" podID="2177774b-dffa-45f7-86f9-f4136b098b38" containerID="79f6767078e3bd6f39b9b2e7f0d07b23e1b8f2634fcc907eb968f58c94952874" exitCode=0
Feb 01 08:01:03 crc kubenswrapper[4546]: I0201 08:01:03.557457 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498881-n8tws" event={"ID":"2177774b-dffa-45f7-86f9-f4136b098b38","Type":"ContainerDied","Data":"79f6767078e3bd6f39b9b2e7f0d07b23e1b8f2634fcc907eb968f58c94952874"}
Feb 01 08:01:04 crc kubenswrapper[4546]: I0201 08:01:04.833651 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.005439 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-combined-ca-bundle\") pod \"2177774b-dffa-45f7-86f9-f4136b098b38\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") "
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.005497 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-config-data\") pod \"2177774b-dffa-45f7-86f9-f4136b098b38\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") "
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.005598 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkvtv\" (UniqueName: \"kubernetes.io/projected/2177774b-dffa-45f7-86f9-f4136b098b38-kube-api-access-mkvtv\") pod \"2177774b-dffa-45f7-86f9-f4136b098b38\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") "
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.005667 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-fernet-keys\") pod \"2177774b-dffa-45f7-86f9-f4136b098b38\" (UID: \"2177774b-dffa-45f7-86f9-f4136b098b38\") "
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.011968 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2177774b-dffa-45f7-86f9-f4136b098b38-kube-api-access-mkvtv" (OuterVolumeSpecName: "kube-api-access-mkvtv") pod "2177774b-dffa-45f7-86f9-f4136b098b38" (UID: "2177774b-dffa-45f7-86f9-f4136b098b38"). InnerVolumeSpecName "kube-api-access-mkvtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.029641 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2177774b-dffa-45f7-86f9-f4136b098b38" (UID: "2177774b-dffa-45f7-86f9-f4136b098b38"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.032160 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2177774b-dffa-45f7-86f9-f4136b098b38" (UID: "2177774b-dffa-45f7-86f9-f4136b098b38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.048579 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-config-data" (OuterVolumeSpecName: "config-data") pod "2177774b-dffa-45f7-86f9-f4136b098b38" (UID: "2177774b-dffa-45f7-86f9-f4136b098b38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.108726 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.108754 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.108764 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkvtv\" (UniqueName: \"kubernetes.io/projected/2177774b-dffa-45f7-86f9-f4136b098b38-kube-api-access-mkvtv\") on node \"crc\" DevicePath \"\""
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.108773 4546 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2177774b-dffa-45f7-86f9-f4136b098b38-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.573825 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498881-n8tws" event={"ID":"2177774b-dffa-45f7-86f9-f4136b098b38","Type":"ContainerDied","Data":"04ed631b4d1e1d73f4090e98badcb7b478fef5739270a10883fe6d82f18fbab3"}
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.573877 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04ed631b4d1e1d73f4090e98badcb7b478fef5739270a10883fe6d82f18fbab3"
Feb 01 08:01:05 crc kubenswrapper[4546]: I0201 08:01:05.573895 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29498881-n8tws"
Feb 01 08:01:25 crc kubenswrapper[4546]: I0201 08:01:25.421352 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:01:25 crc kubenswrapper[4546]: I0201 08:01:25.421748 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:01:55 crc kubenswrapper[4546]: I0201 08:01:55.421112 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:01:55 crc kubenswrapper[4546]: I0201 08:01:55.421498 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:01:55 crc kubenswrapper[4546]: I0201 08:01:55.976823 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rwhfn"]
Feb 01 08:01:55 crc kubenswrapper[4546]: E0201 08:01:55.977168 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2177774b-dffa-45f7-86f9-f4136b098b38" containerName="keystone-cron"
Feb 01 08:01:55 crc kubenswrapper[4546]: I0201 08:01:55.977186 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2177774b-dffa-45f7-86f9-f4136b098b38" containerName="keystone-cron"
Feb 01 08:01:55 crc kubenswrapper[4546]: I0201 08:01:55.977409 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="2177774b-dffa-45f7-86f9-f4136b098b38" containerName="keystone-cron"
Feb 01 08:01:55 crc kubenswrapper[4546]: I0201 08:01:55.979236 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:55 crc kubenswrapper[4546]: I0201 08:01:55.988486 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwhfn"]
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.169448 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqf84\" (UniqueName: \"kubernetes.io/projected/1defbf07-3486-4cb2-8f6f-742506192e78-kube-api-access-zqf84\") pod \"community-operators-rwhfn\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") " pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.169546 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-utilities\") pod \"community-operators-rwhfn\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") " pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.169731 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-catalog-content\") pod \"community-operators-rwhfn\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") " pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.270906 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqf84\" (UniqueName: \"kubernetes.io/projected/1defbf07-3486-4cb2-8f6f-742506192e78-kube-api-access-zqf84\") pod \"community-operators-rwhfn\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") " pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.270966 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-utilities\") pod \"community-operators-rwhfn\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") " pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.271052 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-catalog-content\") pod \"community-operators-rwhfn\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") " pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.271417 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-utilities\") pod \"community-operators-rwhfn\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") " pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.271465 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-catalog-content\") pod \"community-operators-rwhfn\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") " pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.288597 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqf84\" (UniqueName: \"kubernetes.io/projected/1defbf07-3486-4cb2-8f6f-742506192e78-kube-api-access-zqf84\") pod \"community-operators-rwhfn\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") " pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.295121 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.693719 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwhfn"]
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.958844 4546 generic.go:334] "Generic (PLEG): container finished" podID="1defbf07-3486-4cb2-8f6f-742506192e78" containerID="82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa" exitCode=0
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.959071 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhfn" event={"ID":"1defbf07-3486-4cb2-8f6f-742506192e78","Type":"ContainerDied","Data":"82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa"}
Feb 01 08:01:56 crc kubenswrapper[4546]: I0201 08:01:56.959292 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhfn" event={"ID":"1defbf07-3486-4cb2-8f6f-742506192e78","Type":"ContainerStarted","Data":"82acfbef1f204369c52f0b2374c26a20c157fab6c87d14e8011a6b13210f3f6d"}
Feb 01 08:01:57 crc kubenswrapper[4546]: I0201 08:01:57.970472 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhfn" event={"ID":"1defbf07-3486-4cb2-8f6f-742506192e78","Type":"ContainerStarted","Data":"b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69"}
Feb 01 08:01:58 crc kubenswrapper[4546]: I0201 08:01:58.978131 4546 generic.go:334] "Generic (PLEG): container finished" podID="1defbf07-3486-4cb2-8f6f-742506192e78" containerID="b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69" exitCode=0
Feb 01 08:01:58 crc kubenswrapper[4546]: I0201 08:01:58.978169 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhfn" event={"ID":"1defbf07-3486-4cb2-8f6f-742506192e78","Type":"ContainerDied","Data":"b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69"}
Feb 01 08:01:59 crc kubenswrapper[4546]: I0201 08:01:59.987446 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhfn" event={"ID":"1defbf07-3486-4cb2-8f6f-742506192e78","Type":"ContainerStarted","Data":"768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c"}
Feb 01 08:02:00 crc kubenswrapper[4546]: I0201 08:02:00.008506 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rwhfn" podStartSLOduration=2.514949163 podStartE2EDuration="5.008484008s" podCreationTimestamp="2026-02-01 08:01:55 +0000 UTC" firstStartedPulling="2026-02-01 08:01:56.960496763 +0000 UTC m=+4747.611432779" lastFinishedPulling="2026-02-01 08:01:59.454031608 +0000 UTC m=+4750.104967624" observedRunningTime="2026-02-01 08:02:00.005582267 +0000 UTC m=+4750.656518283" watchObservedRunningTime="2026-02-01 08:02:00.008484008 +0000 UTC m=+4750.659420024"
Feb 01 08:02:06 crc kubenswrapper[4546]: I0201 08:02:06.295631 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:02:06 crc kubenswrapper[4546]: I0201 08:02:06.296161 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:02:06 crc kubenswrapper[4546]: I0201 08:02:06.334386 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:02:07 crc kubenswrapper[4546]: I0201 08:02:07.088568 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:02:07 crc kubenswrapper[4546]: I0201 08:02:07.135607 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwhfn"]
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.059723 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rwhfn" podUID="1defbf07-3486-4cb2-8f6f-742506192e78" containerName="registry-server" containerID="cri-o://768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c" gracePeriod=2
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.542997 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.653537 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-catalog-content\") pod \"1defbf07-3486-4cb2-8f6f-742506192e78\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") "
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.668174 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-utilities\") pod \"1defbf07-3486-4cb2-8f6f-742506192e78\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") "
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.668881 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-utilities" (OuterVolumeSpecName: "utilities") pod "1defbf07-3486-4cb2-8f6f-742506192e78" (UID: "1defbf07-3486-4cb2-8f6f-742506192e78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.674939 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqf84\" (UniqueName: \"kubernetes.io/projected/1defbf07-3486-4cb2-8f6f-742506192e78-kube-api-access-zqf84\") pod \"1defbf07-3486-4cb2-8f6f-742506192e78\" (UID: \"1defbf07-3486-4cb2-8f6f-742506192e78\") "
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.676465 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.692311 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1defbf07-3486-4cb2-8f6f-742506192e78-kube-api-access-zqf84" (OuterVolumeSpecName: "kube-api-access-zqf84") pod "1defbf07-3486-4cb2-8f6f-742506192e78" (UID: "1defbf07-3486-4cb2-8f6f-742506192e78"). InnerVolumeSpecName "kube-api-access-zqf84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.743763 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1defbf07-3486-4cb2-8f6f-742506192e78" (UID: "1defbf07-3486-4cb2-8f6f-742506192e78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.778850 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1defbf07-3486-4cb2-8f6f-742506192e78-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 01 08:02:09 crc kubenswrapper[4546]: I0201 08:02:09.778894 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqf84\" (UniqueName: \"kubernetes.io/projected/1defbf07-3486-4cb2-8f6f-742506192e78-kube-api-access-zqf84\") on node \"crc\" DevicePath \"\""
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.068680 4546 generic.go:334] "Generic (PLEG): container finished" podID="1defbf07-3486-4cb2-8f6f-742506192e78" containerID="768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c" exitCode=0
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.068710 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwhfn"
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.068735 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhfn" event={"ID":"1defbf07-3486-4cb2-8f6f-742506192e78","Type":"ContainerDied","Data":"768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c"}
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.068901 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhfn" event={"ID":"1defbf07-3486-4cb2-8f6f-742506192e78","Type":"ContainerDied","Data":"82acfbef1f204369c52f0b2374c26a20c157fab6c87d14e8011a6b13210f3f6d"}
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.068919 4546 scope.go:117] "RemoveContainer" containerID="768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c"
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.088401 4546 scope.go:117] "RemoveContainer" containerID="b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69"
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.108178 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwhfn"]
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.113039 4546 scope.go:117] "RemoveContainer" containerID="82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa"
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.114978 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rwhfn"]
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.138234 4546 scope.go:117] "RemoveContainer" containerID="768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c"
Feb 01 08:02:10 crc kubenswrapper[4546]: E0201 08:02:10.138680 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c\": container with ID starting with 768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c not found: ID does not exist" containerID="768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c"
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.138712 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c"} err="failed to get container status \"768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c\": rpc error: code = NotFound desc = could not find container \"768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c\": container with ID starting with 768679c8b698e7194263de41750f268b5c9620809b577969409309f372b2051c not found: ID does not exist"
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.138733 4546 scope.go:117] "RemoveContainer" containerID="b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69"
Feb 01 08:02:10 crc kubenswrapper[4546]: E0201 08:02:10.138975 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69\": container with ID starting with b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69 not found: ID does not exist" containerID="b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69"
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.138996 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69"} err="failed to get container status \"b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69\": rpc error: code = NotFound desc = could not find container \"b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69\": container with ID starting with b5a732d210025a25c5ce8b7726c4d241e514315499effd700adcb9943c6e2d69 not found: ID does not exist"
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.139012 4546 scope.go:117] "RemoveContainer" containerID="82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa"
Feb 01 08:02:10 crc kubenswrapper[4546]: E0201 08:02:10.139298 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa\": container with ID starting with 82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa not found: ID does not exist" containerID="82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa"
Feb 01 08:02:10 crc kubenswrapper[4546]: I0201 08:02:10.139319 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa"} err="failed to get container status \"82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa\": rpc error: code = NotFound desc = could not find container \"82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa\": container with ID starting with 82468b1e36e7ccbc97cec35169e219e1b1146fcb6048e97d6e0dd10c3c6610aa not found: ID does not exist"
Feb 01 08:02:11 crc kubenswrapper[4546]: I0201 08:02:11.665153 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1defbf07-3486-4cb2-8f6f-742506192e78" path="/var/lib/kubelet/pods/1defbf07-3486-4cb2-8f6f-742506192e78/volumes"
Feb 01 08:02:25 crc kubenswrapper[4546]: I0201 08:02:25.421234 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:02:25 crc kubenswrapper[4546]: I0201 08:02:25.421813 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:02:25 crc kubenswrapper[4546]: I0201 08:02:25.421886 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx"
Feb 01 08:02:25 crc kubenswrapper[4546]: I0201 08:02:25.423225 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cbff502824aa7d43ad4068ab9e561311498bbd9e7840870704d4dc72b3dc726"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 01 08:02:25 crc kubenswrapper[4546]: I0201 08:02:25.423311 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://2cbff502824aa7d43ad4068ab9e561311498bbd9e7840870704d4dc72b3dc726" gracePeriod=600
Feb 01 08:02:26 crc kubenswrapper[4546]: I0201 08:02:26.188199 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="2cbff502824aa7d43ad4068ab9e561311498bbd9e7840870704d4dc72b3dc726" exitCode=0
Feb 01 08:02:26 crc kubenswrapper[4546]: I0201 08:02:26.188267 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"2cbff502824aa7d43ad4068ab9e561311498bbd9e7840870704d4dc72b3dc726"}
Feb 01 08:02:26 crc kubenswrapper[4546]: I0201 08:02:26.188743 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8"}
Feb 01 08:02:26 crc kubenswrapper[4546]: I0201 08:02:26.188773 4546 scope.go:117] "RemoveContainer" containerID="ac27578301affa10a61c523acbf1587a86e6ec6ceea8c8df098957eaa4996bd2"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.692357 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kg7rr"]
Feb 01 08:04:28 crc kubenswrapper[4546]: E0201 08:04:28.696144 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1defbf07-3486-4cb2-8f6f-742506192e78" containerName="registry-server"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.696171 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1defbf07-3486-4cb2-8f6f-742506192e78" containerName="registry-server"
Feb 01 08:04:28 crc kubenswrapper[4546]: E0201 08:04:28.696193 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1defbf07-3486-4cb2-8f6f-742506192e78" containerName="extract-content"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.696199 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1defbf07-3486-4cb2-8f6f-742506192e78" containerName="extract-content"
Feb 01 08:04:28 crc kubenswrapper[4546]: E0201 08:04:28.696222 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1defbf07-3486-4cb2-8f6f-742506192e78" containerName="extract-utilities"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.696229 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1defbf07-3486-4cb2-8f6f-742506192e78" containerName="extract-utilities"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.696459 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="1defbf07-3486-4cb2-8f6f-742506192e78" containerName="registry-server"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.697695 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kg7rr"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.705710 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg7rr"]
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.718095 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-utilities\") pod \"redhat-marketplace-kg7rr\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " pod="openshift-marketplace/redhat-marketplace-kg7rr"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.718147 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmktr\" (UniqueName: \"kubernetes.io/projected/70c04dd9-153a-40bc-a303-11dcd4809384-kube-api-access-nmktr\") pod \"redhat-marketplace-kg7rr\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " pod="openshift-marketplace/redhat-marketplace-kg7rr"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.718175 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-catalog-content\") pod \"redhat-marketplace-kg7rr\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " pod="openshift-marketplace/redhat-marketplace-kg7rr"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.821046 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-utilities\") pod \"redhat-marketplace-kg7rr\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " pod="openshift-marketplace/redhat-marketplace-kg7rr"
Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.821220 4546 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"kube-api-access-nmktr\" (UniqueName: \"kubernetes.io/projected/70c04dd9-153a-40bc-a303-11dcd4809384-kube-api-access-nmktr\") pod \"redhat-marketplace-kg7rr\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.821269 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-catalog-content\") pod \"redhat-marketplace-kg7rr\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.821557 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-utilities\") pod \"redhat-marketplace-kg7rr\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.821824 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-catalog-content\") pod \"redhat-marketplace-kg7rr\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:28 crc kubenswrapper[4546]: I0201 08:04:28.844485 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmktr\" (UniqueName: \"kubernetes.io/projected/70c04dd9-153a-40bc-a303-11dcd4809384-kube-api-access-nmktr\") pod \"redhat-marketplace-kg7rr\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:29 crc kubenswrapper[4546]: I0201 08:04:29.013309 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:29 crc kubenswrapper[4546]: I0201 08:04:29.491553 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg7rr"] Feb 01 08:04:30 crc kubenswrapper[4546]: I0201 08:04:30.350117 4546 generic.go:334] "Generic (PLEG): container finished" podID="70c04dd9-153a-40bc-a303-11dcd4809384" containerID="b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248" exitCode=0 Feb 01 08:04:30 crc kubenswrapper[4546]: I0201 08:04:30.350178 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg7rr" event={"ID":"70c04dd9-153a-40bc-a303-11dcd4809384","Type":"ContainerDied","Data":"b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248"} Feb 01 08:04:30 crc kubenswrapper[4546]: I0201 08:04:30.350534 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg7rr" event={"ID":"70c04dd9-153a-40bc-a303-11dcd4809384","Type":"ContainerStarted","Data":"21354707474d5d0f5e53aebaa635913d1470cd3f04980f323ebdfcc761631185"} Feb 01 08:04:30 crc kubenswrapper[4546]: I0201 08:04:30.355689 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:04:31 crc kubenswrapper[4546]: I0201 08:04:31.366724 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg7rr" event={"ID":"70c04dd9-153a-40bc-a303-11dcd4809384","Type":"ContainerStarted","Data":"efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc"} Feb 01 08:04:32 crc kubenswrapper[4546]: I0201 08:04:32.377582 4546 generic.go:334] "Generic (PLEG): container finished" podID="70c04dd9-153a-40bc-a303-11dcd4809384" containerID="efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc" exitCode=0 Feb 01 08:04:32 crc kubenswrapper[4546]: I0201 08:04:32.377695 4546 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-kg7rr" event={"ID":"70c04dd9-153a-40bc-a303-11dcd4809384","Type":"ContainerDied","Data":"efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc"} Feb 01 08:04:33 crc kubenswrapper[4546]: I0201 08:04:33.388753 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg7rr" event={"ID":"70c04dd9-153a-40bc-a303-11dcd4809384","Type":"ContainerStarted","Data":"dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565"} Feb 01 08:04:33 crc kubenswrapper[4546]: I0201 08:04:33.408352 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kg7rr" podStartSLOduration=2.864731592 podStartE2EDuration="5.408335366s" podCreationTimestamp="2026-02-01 08:04:28 +0000 UTC" firstStartedPulling="2026-02-01 08:04:30.355380869 +0000 UTC m=+4901.006316886" lastFinishedPulling="2026-02-01 08:04:32.898984644 +0000 UTC m=+4903.549920660" observedRunningTime="2026-02-01 08:04:33.405843598 +0000 UTC m=+4904.056779614" watchObservedRunningTime="2026-02-01 08:04:33.408335366 +0000 UTC m=+4904.059271372" Feb 01 08:04:34 crc kubenswrapper[4546]: I0201 08:04:34.877266 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wdlhz"] Feb 01 08:04:34 crc kubenswrapper[4546]: I0201 08:04:34.879110 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:34 crc kubenswrapper[4546]: I0201 08:04:34.893425 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdlhz"] Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.067402 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2drtl\" (UniqueName: \"kubernetes.io/projected/618ee999-ab83-4e1e-a252-ebb7c9f22602-kube-api-access-2drtl\") pod \"redhat-operators-wdlhz\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.067848 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-catalog-content\") pod \"redhat-operators-wdlhz\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.068155 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-utilities\") pod \"redhat-operators-wdlhz\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.170971 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2drtl\" (UniqueName: \"kubernetes.io/projected/618ee999-ab83-4e1e-a252-ebb7c9f22602-kube-api-access-2drtl\") pod \"redhat-operators-wdlhz\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.171733 4546 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-catalog-content\") pod \"redhat-operators-wdlhz\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.172728 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-catalog-content\") pod \"redhat-operators-wdlhz\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.172804 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-utilities\") pod \"redhat-operators-wdlhz\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.173063 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-utilities\") pod \"redhat-operators-wdlhz\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.195518 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2drtl\" (UniqueName: \"kubernetes.io/projected/618ee999-ab83-4e1e-a252-ebb7c9f22602-kube-api-access-2drtl\") pod \"redhat-operators-wdlhz\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.200391 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:35 crc kubenswrapper[4546]: I0201 08:04:35.737603 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdlhz"] Feb 01 08:04:36 crc kubenswrapper[4546]: I0201 08:04:36.419670 4546 generic.go:334] "Generic (PLEG): container finished" podID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerID="bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284" exitCode=0 Feb 01 08:04:36 crc kubenswrapper[4546]: I0201 08:04:36.419768 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdlhz" event={"ID":"618ee999-ab83-4e1e-a252-ebb7c9f22602","Type":"ContainerDied","Data":"bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284"} Feb 01 08:04:36 crc kubenswrapper[4546]: I0201 08:04:36.420076 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdlhz" event={"ID":"618ee999-ab83-4e1e-a252-ebb7c9f22602","Type":"ContainerStarted","Data":"6e47608b3eb971146e2d0a026cd48f02139a1320cb4563c364d99ae9f677f9f0"} Feb 01 08:04:37 crc kubenswrapper[4546]: I0201 08:04:37.429852 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdlhz" event={"ID":"618ee999-ab83-4e1e-a252-ebb7c9f22602","Type":"ContainerStarted","Data":"d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9"} Feb 01 08:04:39 crc kubenswrapper[4546]: I0201 08:04:39.014683 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:39 crc kubenswrapper[4546]: I0201 08:04:39.015213 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:39 crc kubenswrapper[4546]: I0201 08:04:39.074516 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:39 crc kubenswrapper[4546]: I0201 08:04:39.859991 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:40 crc kubenswrapper[4546]: I0201 08:04:40.673138 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg7rr"] Feb 01 08:04:41 crc kubenswrapper[4546]: I0201 08:04:41.462843 4546 generic.go:334] "Generic (PLEG): container finished" podID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerID="d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9" exitCode=0 Feb 01 08:04:41 crc kubenswrapper[4546]: I0201 08:04:41.462900 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdlhz" event={"ID":"618ee999-ab83-4e1e-a252-ebb7c9f22602","Type":"ContainerDied","Data":"d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9"} Feb 01 08:04:41 crc kubenswrapper[4546]: I0201 08:04:41.464889 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kg7rr" podUID="70c04dd9-153a-40bc-a303-11dcd4809384" containerName="registry-server" containerID="cri-o://dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565" gracePeriod=2 Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.105920 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.239249 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-catalog-content\") pod \"70c04dd9-153a-40bc-a303-11dcd4809384\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.239674 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmktr\" (UniqueName: \"kubernetes.io/projected/70c04dd9-153a-40bc-a303-11dcd4809384-kube-api-access-nmktr\") pod \"70c04dd9-153a-40bc-a303-11dcd4809384\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.239828 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-utilities\") pod \"70c04dd9-153a-40bc-a303-11dcd4809384\" (UID: \"70c04dd9-153a-40bc-a303-11dcd4809384\") " Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.241893 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-utilities" (OuterVolumeSpecName: "utilities") pod "70c04dd9-153a-40bc-a303-11dcd4809384" (UID: "70c04dd9-153a-40bc-a303-11dcd4809384"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.253669 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70c04dd9-153a-40bc-a303-11dcd4809384" (UID: "70c04dd9-153a-40bc-a303-11dcd4809384"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.253853 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c04dd9-153a-40bc-a303-11dcd4809384-kube-api-access-nmktr" (OuterVolumeSpecName: "kube-api-access-nmktr") pod "70c04dd9-153a-40bc-a303-11dcd4809384" (UID: "70c04dd9-153a-40bc-a303-11dcd4809384"). InnerVolumeSpecName "kube-api-access-nmktr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.342910 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.343065 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmktr\" (UniqueName: \"kubernetes.io/projected/70c04dd9-153a-40bc-a303-11dcd4809384-kube-api-access-nmktr\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.343139 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c04dd9-153a-40bc-a303-11dcd4809384-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.474292 4546 generic.go:334] "Generic (PLEG): container finished" podID="70c04dd9-153a-40bc-a303-11dcd4809384" containerID="dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565" exitCode=0 Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.474385 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kg7rr" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.474390 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg7rr" event={"ID":"70c04dd9-153a-40bc-a303-11dcd4809384","Type":"ContainerDied","Data":"dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565"} Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.474437 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg7rr" event={"ID":"70c04dd9-153a-40bc-a303-11dcd4809384","Type":"ContainerDied","Data":"21354707474d5d0f5e53aebaa635913d1470cd3f04980f323ebdfcc761631185"} Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.474475 4546 scope.go:117] "RemoveContainer" containerID="dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.480123 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdlhz" event={"ID":"618ee999-ab83-4e1e-a252-ebb7c9f22602","Type":"ContainerStarted","Data":"be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132"} Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.504207 4546 scope.go:117] "RemoveContainer" containerID="efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.531926 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wdlhz" podStartSLOduration=3.017681543 podStartE2EDuration="8.531900036s" podCreationTimestamp="2026-02-01 08:04:34 +0000 UTC" firstStartedPulling="2026-02-01 08:04:36.423746911 +0000 UTC m=+4907.074682928" lastFinishedPulling="2026-02-01 08:04:41.937965404 +0000 UTC m=+4912.588901421" observedRunningTime="2026-02-01 08:04:42.510072564 +0000 UTC m=+4913.161008580" watchObservedRunningTime="2026-02-01 08:04:42.531900036 
+0000 UTC m=+4913.182836073" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.550840 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg7rr"] Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.556228 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg7rr"] Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.557040 4546 scope.go:117] "RemoveContainer" containerID="b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.578330 4546 scope.go:117] "RemoveContainer" containerID="dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565" Feb 01 08:04:42 crc kubenswrapper[4546]: E0201 08:04:42.578851 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565\": container with ID starting with dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565 not found: ID does not exist" containerID="dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.578964 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565"} err="failed to get container status \"dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565\": rpc error: code = NotFound desc = could not find container \"dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565\": container with ID starting with dc493ba3fc22754aeaf0cbb37ff73539339528f177b99008fab85febc86f4565 not found: ID does not exist" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.579052 4546 scope.go:117] "RemoveContainer" containerID="efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc" Feb 01 08:04:42 crc 
kubenswrapper[4546]: E0201 08:04:42.579427 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc\": container with ID starting with efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc not found: ID does not exist" containerID="efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.579528 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc"} err="failed to get container status \"efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc\": rpc error: code = NotFound desc = could not find container \"efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc\": container with ID starting with efb04a924e0893c5bb779e36b998420dc6679697b422d5180aeac81152d480fc not found: ID does not exist" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.579598 4546 scope.go:117] "RemoveContainer" containerID="b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248" Feb 01 08:04:42 crc kubenswrapper[4546]: E0201 08:04:42.579985 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248\": container with ID starting with b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248 not found: ID does not exist" containerID="b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248" Feb 01 08:04:42 crc kubenswrapper[4546]: I0201 08:04:42.580022 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248"} err="failed to get container status 
\"b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248\": rpc error: code = NotFound desc = could not find container \"b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248\": container with ID starting with b4d088f06ded755411e7c3c0b349da4044068a380bc7e7904c4ccaeb3b830248 not found: ID does not exist" Feb 01 08:04:43 crc kubenswrapper[4546]: I0201 08:04:43.666804 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c04dd9-153a-40bc-a303-11dcd4809384" path="/var/lib/kubelet/pods/70c04dd9-153a-40bc-a303-11dcd4809384/volumes" Feb 01 08:04:45 crc kubenswrapper[4546]: I0201 08:04:45.200633 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:45 crc kubenswrapper[4546]: I0201 08:04:45.201025 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:46 crc kubenswrapper[4546]: I0201 08:04:46.244931 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wdlhz" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerName="registry-server" probeResult="failure" output=< Feb 01 08:04:46 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 08:04:46 crc kubenswrapper[4546]: > Feb 01 08:04:55 crc kubenswrapper[4546]: I0201 08:04:55.245944 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:55 crc kubenswrapper[4546]: I0201 08:04:55.297288 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:55 crc kubenswrapper[4546]: I0201 08:04:55.421353 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:04:55 crc kubenswrapper[4546]: I0201 08:04:55.421431 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:04:55 crc kubenswrapper[4546]: I0201 08:04:55.493403 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wdlhz"] Feb 01 08:04:56 crc kubenswrapper[4546]: I0201 08:04:56.647342 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wdlhz" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerName="registry-server" containerID="cri-o://be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132" gracePeriod=2 Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.027953 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.112186 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-catalog-content\") pod \"618ee999-ab83-4e1e-a252-ebb7c9f22602\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.112278 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2drtl\" (UniqueName: \"kubernetes.io/projected/618ee999-ab83-4e1e-a252-ebb7c9f22602-kube-api-access-2drtl\") pod \"618ee999-ab83-4e1e-a252-ebb7c9f22602\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.112309 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-utilities\") pod \"618ee999-ab83-4e1e-a252-ebb7c9f22602\" (UID: \"618ee999-ab83-4e1e-a252-ebb7c9f22602\") " Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.117159 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-utilities" (OuterVolumeSpecName: "utilities") pod "618ee999-ab83-4e1e-a252-ebb7c9f22602" (UID: "618ee999-ab83-4e1e-a252-ebb7c9f22602"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.136483 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618ee999-ab83-4e1e-a252-ebb7c9f22602-kube-api-access-2drtl" (OuterVolumeSpecName: "kube-api-access-2drtl") pod "618ee999-ab83-4e1e-a252-ebb7c9f22602" (UID: "618ee999-ab83-4e1e-a252-ebb7c9f22602"). InnerVolumeSpecName "kube-api-access-2drtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.216053 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2drtl\" (UniqueName: \"kubernetes.io/projected/618ee999-ab83-4e1e-a252-ebb7c9f22602-kube-api-access-2drtl\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.216086 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.233896 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "618ee999-ab83-4e1e-a252-ebb7c9f22602" (UID: "618ee999-ab83-4e1e-a252-ebb7c9f22602"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.317058 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/618ee999-ab83-4e1e-a252-ebb7c9f22602-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.659216 4546 generic.go:334] "Generic (PLEG): container finished" podID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerID="be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132" exitCode=0 Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.659410 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdlhz" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.668527 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdlhz" event={"ID":"618ee999-ab83-4e1e-a252-ebb7c9f22602","Type":"ContainerDied","Data":"be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132"} Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.668581 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdlhz" event={"ID":"618ee999-ab83-4e1e-a252-ebb7c9f22602","Type":"ContainerDied","Data":"6e47608b3eb971146e2d0a026cd48f02139a1320cb4563c364d99ae9f677f9f0"} Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.668610 4546 scope.go:117] "RemoveContainer" containerID="be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.695757 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wdlhz"] Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.711579 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wdlhz"] Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.719413 4546 scope.go:117] "RemoveContainer" containerID="d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.740645 4546 scope.go:117] "RemoveContainer" containerID="bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.778228 4546 scope.go:117] "RemoveContainer" containerID="be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132" Feb 01 08:04:57 crc kubenswrapper[4546]: E0201 08:04:57.778596 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132\": container with ID starting with be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132 not found: ID does not exist" containerID="be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.778681 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132"} err="failed to get container status \"be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132\": rpc error: code = NotFound desc = could not find container \"be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132\": container with ID starting with be8622a1aa650236c544b1e0efe8229f2d1e5348b74da9fe6cf1ad18e8d96132 not found: ID does not exist" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.778754 4546 scope.go:117] "RemoveContainer" containerID="d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9" Feb 01 08:04:57 crc kubenswrapper[4546]: E0201 08:04:57.778975 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9\": container with ID starting with d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9 not found: ID does not exist" containerID="d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.779064 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9"} err="failed to get container status \"d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9\": rpc error: code = NotFound desc = could not find container \"d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9\": container with ID 
starting with d59e363c10b955bc65cfd87bda63b0eb99fa6012b987c0f886ba35276ba2e2a9 not found: ID does not exist" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.779080 4546 scope.go:117] "RemoveContainer" containerID="bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284" Feb 01 08:04:57 crc kubenswrapper[4546]: E0201 08:04:57.779569 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284\": container with ID starting with bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284 not found: ID does not exist" containerID="bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284" Feb 01 08:04:57 crc kubenswrapper[4546]: I0201 08:04:57.779636 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284"} err="failed to get container status \"bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284\": rpc error: code = NotFound desc = could not find container \"bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284\": container with ID starting with bbd596f5396f74d1f06625a6095f84f88ef19434987ec4baccb2733ad3256284 not found: ID does not exist" Feb 01 08:04:59 crc kubenswrapper[4546]: I0201 08:04:59.667519 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" path="/var/lib/kubelet/pods/618ee999-ab83-4e1e-a252-ebb7c9f22602/volumes" Feb 01 08:05:25 crc kubenswrapper[4546]: I0201 08:05:25.420621 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:05:25 crc kubenswrapper[4546]: I0201 
08:05:25.421382 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:05:43 crc kubenswrapper[4546]: I0201 08:05:43.066147 4546 generic.go:334] "Generic (PLEG): container finished" podID="d08de2e0-03ce-4faf-84fa-45d69c70a38b" containerID="c93b3611c4519ecf1f1de2a36e96b22725af693d46fb02cd89314b7a502a3562" exitCode=0 Feb 01 08:05:43 crc kubenswrapper[4546]: I0201 08:05:43.066235 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"d08de2e0-03ce-4faf-84fa-45d69c70a38b","Type":"ContainerDied","Data":"c93b3611c4519ecf1f1de2a36e96b22725af693d46fb02cd89314b7a502a3562"} Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.546837 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.676752 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"] Feb 01 08:05:44 crc kubenswrapper[4546]: E0201 08:05:44.677208 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08de2e0-03ce-4faf-84fa-45d69c70a38b" containerName="tempest-tests-tempest-tests-runner" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677233 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08de2e0-03ce-4faf-84fa-45d69c70a38b" containerName="tempest-tests-tempest-tests-runner" Feb 01 08:05:44 crc kubenswrapper[4546]: E0201 08:05:44.677254 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerName="extract-content" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677260 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerName="extract-content" Feb 01 08:05:44 crc kubenswrapper[4546]: E0201 08:05:44.677272 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c04dd9-153a-40bc-a303-11dcd4809384" containerName="extract-utilities" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677277 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c04dd9-153a-40bc-a303-11dcd4809384" containerName="extract-utilities" Feb 01 08:05:44 crc kubenswrapper[4546]: E0201 08:05:44.677284 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerName="extract-utilities" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677289 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerName="extract-utilities" Feb 01 08:05:44 crc kubenswrapper[4546]: E0201 08:05:44.677300 4546 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="70c04dd9-153a-40bc-a303-11dcd4809384" containerName="extract-content" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677305 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c04dd9-153a-40bc-a303-11dcd4809384" containerName="extract-content" Feb 01 08:05:44 crc kubenswrapper[4546]: E0201 08:05:44.677316 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c04dd9-153a-40bc-a303-11dcd4809384" containerName="registry-server" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677321 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c04dd9-153a-40bc-a303-11dcd4809384" containerName="registry-server" Feb 01 08:05:44 crc kubenswrapper[4546]: E0201 08:05:44.677339 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerName="registry-server" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677343 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerName="registry-server" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677546 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08de2e0-03ce-4faf-84fa-45d69c70a38b" containerName="tempest-tests-tempest-tests-runner" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677559 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c04dd9-153a-40bc-a303-11dcd4809384" containerName="registry-server" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.677581 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="618ee999-ab83-4e1e-a252-ebb7c9f22602" containerName="registry-server" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.679772 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.692766 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s1" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.695044 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s1" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.700461 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"] Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.714195 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjtlz\" (UniqueName: \"kubernetes.io/projected/d08de2e0-03ce-4faf-84fa-45d69c70a38b-kube-api-access-hjtlz\") pod \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.714232 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.714270 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ca-certs\") pod \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.714345 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ssh-key\") pod \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\" (UID: 
\"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.714491 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-temporary\") pod \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.714530 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-config-data\") pod \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.714630 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config-secret\") pod \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.714749 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config\") pod \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.714799 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-workdir\") pod \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\" (UID: \"d08de2e0-03ce-4faf-84fa-45d69c70a38b\") " Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.717485 4546 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d08de2e0-03ce-4faf-84fa-45d69c70a38b" (UID: "d08de2e0-03ce-4faf-84fa-45d69c70a38b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.722777 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d08de2e0-03ce-4faf-84fa-45d69c70a38b" (UID: "d08de2e0-03ce-4faf-84fa-45d69c70a38b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.729751 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-config-data" (OuterVolumeSpecName: "config-data") pod "d08de2e0-03ce-4faf-84fa-45d69c70a38b" (UID: "d08de2e0-03ce-4faf-84fa-45d69c70a38b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.737034 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08de2e0-03ce-4faf-84fa-45d69c70a38b-kube-api-access-hjtlz" (OuterVolumeSpecName: "kube-api-access-hjtlz") pod "d08de2e0-03ce-4faf-84fa-45d69c70a38b" (UID: "d08de2e0-03ce-4faf-84fa-45d69c70a38b"). InnerVolumeSpecName "kube-api-access-hjtlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.754937 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d08de2e0-03ce-4faf-84fa-45d69c70a38b" (UID: "d08de2e0-03ce-4faf-84fa-45d69c70a38b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.757228 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d08de2e0-03ce-4faf-84fa-45d69c70a38b" (UID: "d08de2e0-03ce-4faf-84fa-45d69c70a38b"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.770879 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d08de2e0-03ce-4faf-84fa-45d69c70a38b" (UID: "d08de2e0-03ce-4faf-84fa-45d69c70a38b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.771193 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d08de2e0-03ce-4faf-84fa-45d69c70a38b" (UID: "d08de2e0-03ce-4faf-84fa-45d69c70a38b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.779757 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d08de2e0-03ce-4faf-84fa-45d69c70a38b" (UID: "d08de2e0-03ce-4faf-84fa-45d69c70a38b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.818675 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819059 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819106 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819203 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819292 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819503 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819557 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819660 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config\") pod 
\"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819707 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwlzl\" (UniqueName: \"kubernetes.io/projected/297ca525-97ec-433c-82ec-01cf98fb4c52-kube-api-access-mwlzl\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819938 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjtlz\" (UniqueName: \"kubernetes.io/projected/d08de2e0-03ce-4faf-84fa-45d69c70a38b-kube-api-access-hjtlz\") on node \"crc\" DevicePath \"\"" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819957 4546 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819966 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819975 4546 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.819985 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:05:44 crc 
kubenswrapper[4546]: I0201 08:05:44.819997 4546 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.820007 4546 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d08de2e0-03ce-4faf-84fa-45d69c70a38b-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.820017 4546 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d08de2e0-03ce-4faf-84fa-45d69c70a38b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.846135 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.921522 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.921584 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" 
(UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.921624 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.921669 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.921732 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.921755 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.921797 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.921823 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwlzl\" (UniqueName: \"kubernetes.io/projected/297ca525-97ec-433c-82ec-01cf98fb4c52-kube-api-access-mwlzl\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.922280 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.922347 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.923029 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " 
pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.923038 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-config-data\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.925760 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.926391 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ssh-key\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.927630 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ca-certs\") pod \"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:44 crc kubenswrapper[4546]: I0201 08:05:44.937801 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwlzl\" (UniqueName: \"kubernetes.io/projected/297ca525-97ec-433c-82ec-01cf98fb4c52-kube-api-access-mwlzl\") pod 
\"tempest-tests-tempest-s01-single-thread-testing\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") " pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:45 crc kubenswrapper[4546]: I0201 08:05:45.005635 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 08:05:45 crc kubenswrapper[4546]: I0201 08:05:45.103055 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" event={"ID":"d08de2e0-03ce-4faf-84fa-45d69c70a38b","Type":"ContainerDied","Data":"cf786f664c054b5666e270e0dccc98c16f486d6afc268888e98e95a6704fc19f"} Feb 01 08:05:45 crc kubenswrapper[4546]: I0201 08:05:45.103128 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-multi-thread-testing" Feb 01 08:05:45 crc kubenswrapper[4546]: I0201 08:05:45.103343 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf786f664c054b5666e270e0dccc98c16f486d6afc268888e98e95a6704fc19f" Feb 01 08:05:45 crc kubenswrapper[4546]: I0201 08:05:45.492212 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-thread-testing"] Feb 01 08:05:46 crc kubenswrapper[4546]: I0201 08:05:46.115989 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"297ca525-97ec-433c-82ec-01cf98fb4c52","Type":"ContainerStarted","Data":"bab04943ac2ed8931f09153854358dfd3bed550313da79291606936b00890b43"} Feb 01 08:05:48 crc kubenswrapper[4546]: I0201 08:05:48.151247 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"297ca525-97ec-433c-82ec-01cf98fb4c52","Type":"ContainerStarted","Data":"7a91d65171a04094906210eccb8074bba06bad8108d8412ec982eacb6b48bc97"} Feb 01 08:05:48 crc 
kubenswrapper[4546]: I0201 08:05:48.167450 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" podStartSLOduration=4.167424256 podStartE2EDuration="4.167424256s" podCreationTimestamp="2026-02-01 08:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:05:48.165416591 +0000 UTC m=+4978.816352607" watchObservedRunningTime="2026-02-01 08:05:48.167424256 +0000 UTC m=+4978.818360262" Feb 01 08:05:55 crc kubenswrapper[4546]: I0201 08:05:55.421395 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:05:55 crc kubenswrapper[4546]: I0201 08:05:55.422141 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:05:55 crc kubenswrapper[4546]: I0201 08:05:55.422186 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 08:05:55 crc kubenswrapper[4546]: I0201 08:05:55.422887 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:05:55 crc kubenswrapper[4546]: 
I0201 08:05:55.422934 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" gracePeriod=600 Feb 01 08:05:55 crc kubenswrapper[4546]: E0201 08:05:55.541074 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:05:56 crc kubenswrapper[4546]: I0201 08:05:56.221316 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" exitCode=0 Feb 01 08:05:56 crc kubenswrapper[4546]: I0201 08:05:56.221395 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8"} Feb 01 08:05:56 crc kubenswrapper[4546]: I0201 08:05:56.221719 4546 scope.go:117] "RemoveContainer" containerID="2cbff502824aa7d43ad4068ab9e561311498bbd9e7840870704d4dc72b3dc726" Feb 01 08:05:56 crc kubenswrapper[4546]: I0201 08:05:56.223089 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:05:56 crc kubenswrapper[4546]: E0201 08:05:56.224176 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:06:10 crc kubenswrapper[4546]: I0201 08:06:10.655468 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:06:10 crc kubenswrapper[4546]: E0201 08:06:10.656714 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:06:25 crc kubenswrapper[4546]: I0201 08:06:25.655472 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:06:25 crc kubenswrapper[4546]: E0201 08:06:25.656759 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:06:32 crc kubenswrapper[4546]: I0201 08:06:32.910181 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d9bbd7fcc-ddszx"] Feb 01 08:06:32 crc kubenswrapper[4546]: I0201 08:06:32.914149 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:32 crc kubenswrapper[4546]: I0201 08:06:32.930476 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d9bbd7fcc-ddszx"] Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.073122 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z4w4\" (UniqueName: \"kubernetes.io/projected/58ba942c-cc7f-4521-aa6b-8e141c861eb9-kube-api-access-5z4w4\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.073226 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-combined-ca-bundle\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.073354 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-public-tls-certs\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.073431 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-ovndb-tls-certs\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.073491 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-config\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.073572 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-httpd-config\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.073641 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-internal-tls-certs\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.176735 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-internal-tls-certs\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.176899 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z4w4\" (UniqueName: \"kubernetes.io/projected/58ba942c-cc7f-4521-aa6b-8e141c861eb9-kube-api-access-5z4w4\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.176964 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-combined-ca-bundle\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.177082 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-public-tls-certs\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.177163 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-ovndb-tls-certs\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.177230 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-config\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.177322 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-httpd-config\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.183693 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-config\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: 
\"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.184369 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-httpd-config\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.184609 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-combined-ca-bundle\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.186697 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-internal-tls-certs\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.191052 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-ovndb-tls-certs\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.192296 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-public-tls-certs\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 
08:06:33.197283 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z4w4\" (UniqueName: \"kubernetes.io/projected/58ba942c-cc7f-4521-aa6b-8e141c861eb9-kube-api-access-5z4w4\") pod \"neutron-5d9bbd7fcc-ddszx\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.230001 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:33 crc kubenswrapper[4546]: I0201 08:06:33.814937 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d9bbd7fcc-ddszx"] Feb 01 08:06:34 crc kubenswrapper[4546]: I0201 08:06:34.571428 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d9bbd7fcc-ddszx" event={"ID":"58ba942c-cc7f-4521-aa6b-8e141c861eb9","Type":"ContainerStarted","Data":"201e45fb339bad634f368fe4234f202a0e4b7498680b7a8faae312806b4765e2"} Feb 01 08:06:34 crc kubenswrapper[4546]: I0201 08:06:34.572466 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:06:34 crc kubenswrapper[4546]: I0201 08:06:34.572540 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d9bbd7fcc-ddszx" event={"ID":"58ba942c-cc7f-4521-aa6b-8e141c861eb9","Type":"ContainerStarted","Data":"85b2c2df3860ae2f5c4820cde0f230c9754869bc81c7299dbe288950bc1e0636"} Feb 01 08:06:34 crc kubenswrapper[4546]: I0201 08:06:34.572611 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d9bbd7fcc-ddszx" event={"ID":"58ba942c-cc7f-4521-aa6b-8e141c861eb9","Type":"ContainerStarted","Data":"3d70bc5b4d2135051dd4ac42b8564487b1705cb3e47ea7937f5fbfd00acc7c8d"} Feb 01 08:06:34 crc kubenswrapper[4546]: I0201 08:06:34.594445 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d9bbd7fcc-ddszx" 
podStartSLOduration=2.5944276139999998 podStartE2EDuration="2.594427614s" podCreationTimestamp="2026-02-01 08:06:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:06:34.591103366 +0000 UTC m=+5025.242039382" watchObservedRunningTime="2026-02-01 08:06:34.594427614 +0000 UTC m=+5025.245363630" Feb 01 08:06:37 crc kubenswrapper[4546]: I0201 08:06:37.654830 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:06:37 crc kubenswrapper[4546]: E0201 08:06:37.657202 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:06:49 crc kubenswrapper[4546]: I0201 08:06:49.659672 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:06:49 crc kubenswrapper[4546]: E0201 08:06:49.660529 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:07:00 crc kubenswrapper[4546]: I0201 08:07:00.655161 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:07:00 crc kubenswrapper[4546]: E0201 08:07:00.656127 4546 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:07:03 crc kubenswrapper[4546]: I0201 08:07:03.246573 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:07:03 crc kubenswrapper[4546]: I0201 08:07:03.310277 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76f65868d9-5zt7q"] Feb 01 08:07:03 crc kubenswrapper[4546]: I0201 08:07:03.310546 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76f65868d9-5zt7q" podUID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerName="neutron-api" containerID="cri-o://5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe" gracePeriod=30 Feb 01 08:07:03 crc kubenswrapper[4546]: I0201 08:07:03.311015 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-76f65868d9-5zt7q" podUID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerName="neutron-httpd" containerID="cri-o://107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6" gracePeriod=30 Feb 01 08:07:03 crc kubenswrapper[4546]: I0201 08:07:03.840004 4546 generic.go:334] "Generic (PLEG): container finished" podID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerID="107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6" exitCode=0 Feb 01 08:07:03 crc kubenswrapper[4546]: I0201 08:07:03.840093 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f65868d9-5zt7q" 
event={"ID":"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6","Type":"ContainerDied","Data":"107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6"} Feb 01 08:07:11 crc kubenswrapper[4546]: I0201 08:07:11.656696 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:07:11 crc kubenswrapper[4546]: E0201 08:07:11.657388 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:07:16 crc kubenswrapper[4546]: I0201 08:07:16.838715 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 08:07:16 crc kubenswrapper[4546]: I0201 08:07:16.977914 4546 generic.go:334] "Generic (PLEG): container finished" podID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerID="5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe" exitCode=0 Feb 01 08:07:16 crc kubenswrapper[4546]: I0201 08:07:16.977983 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f65868d9-5zt7q" event={"ID":"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6","Type":"ContainerDied","Data":"5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe"} Feb 01 08:07:16 crc kubenswrapper[4546]: I0201 08:07:16.978032 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76f65868d9-5zt7q" event={"ID":"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6","Type":"ContainerDied","Data":"fa42d5e7be806b1baf4c7998c69f5d97904289206e922760eec79dfc2ce913ad"} Feb 01 08:07:16 crc kubenswrapper[4546]: I0201 08:07:16.978051 4546 scope.go:117] "RemoveContainer" 
containerID="107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6" Feb 01 08:07:16 crc kubenswrapper[4546]: I0201 08:07:16.978438 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76f65868d9-5zt7q" Feb 01 08:07:16 crc kubenswrapper[4546]: I0201 08:07:16.999615 4546 scope.go:117] "RemoveContainer" containerID="5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.021525 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-internal-tls-certs\") pod \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.021663 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-combined-ca-bundle\") pod \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.021813 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkzsh\" (UniqueName: \"kubernetes.io/projected/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-kube-api-access-lkzsh\") pod \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.021891 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-ovndb-tls-certs\") pod \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.022272 4546 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-config\") pod \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.022326 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-public-tls-certs\") pod \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.022364 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-httpd-config\") pod \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\" (UID: \"0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6\") " Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.033304 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-kube-api-access-lkzsh" (OuterVolumeSpecName: "kube-api-access-lkzsh") pod "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" (UID: "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6"). InnerVolumeSpecName "kube-api-access-lkzsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.033495 4546 scope.go:117] "RemoveContainer" containerID="107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6" Feb 01 08:07:17 crc kubenswrapper[4546]: E0201 08:07:17.034079 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6\": container with ID starting with 107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6 not found: ID does not exist" containerID="107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.034116 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6"} err="failed to get container status \"107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6\": rpc error: code = NotFound desc = could not find container \"107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6\": container with ID starting with 107c44d1973ddf83764d2371d05ba7e80bf92c740aa4056d12a38fd08f7871d6 not found: ID does not exist" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.034145 4546 scope.go:117] "RemoveContainer" containerID="5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe" Feb 01 08:07:17 crc kubenswrapper[4546]: E0201 08:07:17.036467 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe\": container with ID starting with 5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe not found: ID does not exist" containerID="5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.036506 
4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe"} err="failed to get container status \"5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe\": rpc error: code = NotFound desc = could not find container \"5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe\": container with ID starting with 5d34952d137ab605b3b89d9fe0c982488146adad60d53993b0a923e9182b87fe not found: ID does not exist" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.056286 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" (UID: "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.125897 4546 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.125923 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkzsh\" (UniqueName: \"kubernetes.io/projected/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-kube-api-access-lkzsh\") on node \"crc\" DevicePath \"\"" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.137034 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" (UID: "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.158138 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" (UID: "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.159504 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" (UID: "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.162238 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-config" (OuterVolumeSpecName: "config") pod "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" (UID: "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.168665 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" (UID: "0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.227789 4546 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.227816 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.227826 4546 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.227836 4546 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.227846 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.314627 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76f65868d9-5zt7q"] Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.329964 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-76f65868d9-5zt7q"] Feb 01 08:07:17 crc kubenswrapper[4546]: I0201 08:07:17.668195 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" path="/var/lib/kubelet/pods/0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6/volumes" Feb 01 08:07:25 crc kubenswrapper[4546]: 
I0201 08:07:25.655345 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:07:25 crc kubenswrapper[4546]: E0201 08:07:25.656274 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:07:37 crc kubenswrapper[4546]: I0201 08:07:37.654655 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:07:37 crc kubenswrapper[4546]: E0201 08:07:37.655564 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:07:51 crc kubenswrapper[4546]: I0201 08:07:51.654705 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:07:51 crc kubenswrapper[4546]: E0201 08:07:51.655622 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:08:03 crc 
kubenswrapper[4546]: I0201 08:08:03.654607 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:08:03 crc kubenswrapper[4546]: E0201 08:08:03.655396 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:08:14 crc kubenswrapper[4546]: I0201 08:08:14.655356 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:08:14 crc kubenswrapper[4546]: E0201 08:08:14.656021 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:08:28 crc kubenswrapper[4546]: I0201 08:08:28.655313 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:08:28 crc kubenswrapper[4546]: E0201 08:08:28.656199 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 
01 08:08:43 crc kubenswrapper[4546]: I0201 08:08:43.655635 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:08:43 crc kubenswrapper[4546]: E0201 08:08:43.656844 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:08:56 crc kubenswrapper[4546]: I0201 08:08:56.654505 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:08:56 crc kubenswrapper[4546]: E0201 08:08:56.655577 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:09:11 crc kubenswrapper[4546]: I0201 08:09:11.655327 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:09:11 crc kubenswrapper[4546]: E0201 08:09:11.656697 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:09:25 crc kubenswrapper[4546]: I0201 08:09:25.654988 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:09:25 crc kubenswrapper[4546]: E0201 08:09:25.656185 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:09:36 crc kubenswrapper[4546]: I0201 08:09:36.654882 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:09:36 crc kubenswrapper[4546]: E0201 08:09:36.655511 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:09:48 crc kubenswrapper[4546]: I0201 08:09:48.655606 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:09:48 crc kubenswrapper[4546]: E0201 08:09:48.656834 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:09:59 crc kubenswrapper[4546]: I0201 08:09:59.663059 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:09:59 crc kubenswrapper[4546]: E0201 08:09:59.664317 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:10:13 crc kubenswrapper[4546]: I0201 08:10:13.655542 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:10:13 crc kubenswrapper[4546]: E0201 08:10:13.656674 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:10:26 crc kubenswrapper[4546]: I0201 08:10:26.654987 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:10:26 crc kubenswrapper[4546]: E0201 08:10:26.655846 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.225632 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9m8qf"] Feb 01 08:10:36 crc kubenswrapper[4546]: E0201 08:10:36.226826 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerName="neutron-api" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.226841 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerName="neutron-api" Feb 01 08:10:36 crc kubenswrapper[4546]: E0201 08:10:36.226883 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerName="neutron-httpd" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.226891 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerName="neutron-httpd" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.227144 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerName="neutron-api" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.227161 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c4ef4a8-3d36-4231-a2e8-06f510d1f3e6" containerName="neutron-httpd" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.228627 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.248344 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9m8qf"] Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.276073 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-catalog-content\") pod \"certified-operators-9m8qf\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.276739 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltnzd\" (UniqueName: \"kubernetes.io/projected/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-kube-api-access-ltnzd\") pod \"certified-operators-9m8qf\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.276898 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-utilities\") pod \"certified-operators-9m8qf\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.379213 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltnzd\" (UniqueName: \"kubernetes.io/projected/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-kube-api-access-ltnzd\") pod \"certified-operators-9m8qf\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.379268 4546 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-utilities\") pod \"certified-operators-9m8qf\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.379340 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-catalog-content\") pod \"certified-operators-9m8qf\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.379916 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-catalog-content\") pod \"certified-operators-9m8qf\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.380109 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-utilities\") pod \"certified-operators-9m8qf\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.403451 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltnzd\" (UniqueName: \"kubernetes.io/projected/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-kube-api-access-ltnzd\") pod \"certified-operators-9m8qf\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:36 crc kubenswrapper[4546]: I0201 08:10:36.548850 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:37 crc kubenswrapper[4546]: I0201 08:10:37.002617 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9m8qf"] Feb 01 08:10:37 crc kubenswrapper[4546]: I0201 08:10:37.863131 4546 generic.go:334] "Generic (PLEG): container finished" podID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerID="d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8" exitCode=0 Feb 01 08:10:37 crc kubenswrapper[4546]: I0201 08:10:37.863242 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m8qf" event={"ID":"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325","Type":"ContainerDied","Data":"d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8"} Feb 01 08:10:37 crc kubenswrapper[4546]: I0201 08:10:37.863482 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m8qf" event={"ID":"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325","Type":"ContainerStarted","Data":"6202a5202e0c60fed3607537f69acb588b284e00b94ddd185c5b43a43fcfe3af"} Feb 01 08:10:37 crc kubenswrapper[4546]: I0201 08:10:37.865101 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:10:38 crc kubenswrapper[4546]: I0201 08:10:38.654358 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8" Feb 01 08:10:38 crc kubenswrapper[4546]: E0201 08:10:38.654572 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 
08:10:38 crc kubenswrapper[4546]: I0201 08:10:38.884199 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m8qf" event={"ID":"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325","Type":"ContainerStarted","Data":"53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8"} Feb 01 08:10:39 crc kubenswrapper[4546]: I0201 08:10:39.897102 4546 generic.go:334] "Generic (PLEG): container finished" podID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerID="53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8" exitCode=0 Feb 01 08:10:39 crc kubenswrapper[4546]: I0201 08:10:39.897227 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m8qf" event={"ID":"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325","Type":"ContainerDied","Data":"53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8"} Feb 01 08:10:40 crc kubenswrapper[4546]: I0201 08:10:40.909057 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m8qf" event={"ID":"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325","Type":"ContainerStarted","Data":"dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a"} Feb 01 08:10:46 crc kubenswrapper[4546]: I0201 08:10:46.549348 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:46 crc kubenswrapper[4546]: I0201 08:10:46.550088 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:46 crc kubenswrapper[4546]: I0201 08:10:46.596954 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:46 crc kubenswrapper[4546]: I0201 08:10:46.621258 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9m8qf" 
podStartSLOduration=8.134472308 podStartE2EDuration="10.621237918s" podCreationTimestamp="2026-02-01 08:10:36 +0000 UTC" firstStartedPulling="2026-02-01 08:10:37.864811452 +0000 UTC m=+5268.515747469" lastFinishedPulling="2026-02-01 08:10:40.351577063 +0000 UTC m=+5271.002513079" observedRunningTime="2026-02-01 08:10:40.933383249 +0000 UTC m=+5271.584319265" watchObservedRunningTime="2026-02-01 08:10:46.621237918 +0000 UTC m=+5277.272173934" Feb 01 08:10:47 crc kubenswrapper[4546]: I0201 08:10:47.019805 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:47 crc kubenswrapper[4546]: I0201 08:10:47.085340 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9m8qf"] Feb 01 08:10:48 crc kubenswrapper[4546]: I0201 08:10:48.992836 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9m8qf" podUID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerName="registry-server" containerID="cri-o://dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a" gracePeriod=2 Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.428596 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.469886 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-utilities\") pod \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.470040 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-catalog-content\") pod \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.470199 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltnzd\" (UniqueName: \"kubernetes.io/projected/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-kube-api-access-ltnzd\") pod \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\" (UID: \"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325\") " Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.470602 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-utilities" (OuterVolumeSpecName: "utilities") pod "4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" (UID: "4c8e7d24-86f3-45b4-b3a6-5124f6b4c325"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.471044 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.487084 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-kube-api-access-ltnzd" (OuterVolumeSpecName: "kube-api-access-ltnzd") pod "4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" (UID: "4c8e7d24-86f3-45b4-b3a6-5124f6b4c325"). InnerVolumeSpecName "kube-api-access-ltnzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.517734 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" (UID: "4c8e7d24-86f3-45b4-b3a6-5124f6b4c325"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.573239 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltnzd\" (UniqueName: \"kubernetes.io/projected/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-kube-api-access-ltnzd\") on node \"crc\" DevicePath \"\"" Feb 01 08:10:49 crc kubenswrapper[4546]: I0201 08:10:49.573274 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.003498 4546 generic.go:334] "Generic (PLEG): container finished" podID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerID="dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a" exitCode=0 Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.003873 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m8qf" event={"ID":"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325","Type":"ContainerDied","Data":"dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a"} Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.003905 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9m8qf" event={"ID":"4c8e7d24-86f3-45b4-b3a6-5124f6b4c325","Type":"ContainerDied","Data":"6202a5202e0c60fed3607537f69acb588b284e00b94ddd185c5b43a43fcfe3af"} Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.003945 4546 scope.go:117] "RemoveContainer" containerID="dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.003981 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9m8qf" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.025909 4546 scope.go:117] "RemoveContainer" containerID="53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.043026 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9m8qf"] Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.051738 4546 scope.go:117] "RemoveContainer" containerID="d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.054294 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9m8qf"] Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.091653 4546 scope.go:117] "RemoveContainer" containerID="dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a" Feb 01 08:10:50 crc kubenswrapper[4546]: E0201 08:10:50.092510 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a\": container with ID starting with dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a not found: ID does not exist" containerID="dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.092551 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a"} err="failed to get container status \"dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a\": rpc error: code = NotFound desc = could not find container \"dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a\": container with ID starting with dc2066edcc1adf61649d62ec9050626ffe4e4bd034d25e50ff38d6afcea2077a not 
found: ID does not exist" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.092579 4546 scope.go:117] "RemoveContainer" containerID="53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8" Feb 01 08:10:50 crc kubenswrapper[4546]: E0201 08:10:50.093066 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8\": container with ID starting with 53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8 not found: ID does not exist" containerID="53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.093161 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8"} err="failed to get container status \"53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8\": rpc error: code = NotFound desc = could not find container \"53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8\": container with ID starting with 53298743b51ba6f7b9da3bcdb8312a4177b76d16b0f7f9f4fabe5d730aa592c8 not found: ID does not exist" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.093238 4546 scope.go:117] "RemoveContainer" containerID="d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8" Feb 01 08:10:50 crc kubenswrapper[4546]: E0201 08:10:50.093693 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8\": container with ID starting with d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8 not found: ID does not exist" containerID="d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8" Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.093775 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8"} err="failed to get container status \"d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8\": rpc error: code = NotFound desc = could not find container \"d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8\": container with ID starting with d81958283940ab15d0430ed36aa9cadcecfdfc6ce4c8096f5b7fd489a4ed85b8 not found: ID does not exist"
Feb 01 08:10:50 crc kubenswrapper[4546]: I0201 08:10:50.654825 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8"
Feb 01 08:10:50 crc kubenswrapper[4546]: E0201 08:10:50.655124 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 08:10:51 crc kubenswrapper[4546]: I0201 08:10:51.664316 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" path="/var/lib/kubelet/pods/4c8e7d24-86f3-45b4-b3a6-5124f6b4c325/volumes"
Feb 01 08:10:52 crc kubenswrapper[4546]: E0201 08:10:52.442374 4546 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8e7d24_86f3_45b4_b3a6_5124f6b4c325.slice/crio-6202a5202e0c60fed3607537f69acb588b284e00b94ddd185c5b43a43fcfe3af\": RecentStats: unable to find data in memory cache]"
Feb 01 08:11:02 crc kubenswrapper[4546]: E0201 08:11:02.653051 4546 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8e7d24_86f3_45b4_b3a6_5124f6b4c325.slice/crio-6202a5202e0c60fed3607537f69acb588b284e00b94ddd185c5b43a43fcfe3af\": RecentStats: unable to find data in memory cache]"
Feb 01 08:11:02 crc kubenswrapper[4546]: I0201 08:11:02.655992 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8"
Feb 01 08:11:03 crc kubenswrapper[4546]: I0201 08:11:03.114653 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"7b1aa0cd05fbc7eb92e72bf3f4b425f898067f0e184912d1ba7ba0f29572dfa2"}
Feb 01 08:11:12 crc kubenswrapper[4546]: E0201 08:11:12.879981 4546 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8e7d24_86f3_45b4_b3a6_5124f6b4c325.slice/crio-6202a5202e0c60fed3607537f69acb588b284e00b94ddd185c5b43a43fcfe3af\": RecentStats: unable to find data in memory cache]"
Feb 01 08:11:23 crc kubenswrapper[4546]: E0201 08:11:23.104400 4546 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8e7d24_86f3_45b4_b3a6_5124f6b4c325.slice/crio-6202a5202e0c60fed3607537f69acb588b284e00b94ddd185c5b43a43fcfe3af\": RecentStats: unable to find data in memory cache]"
Feb 01 08:11:33 crc kubenswrapper[4546]: E0201 08:11:33.317491 4546 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8e7d24_86f3_45b4_b3a6_5124f6b4c325.slice/crio-6202a5202e0c60fed3607537f69acb588b284e00b94ddd185c5b43a43fcfe3af\": RecentStats: unable to find data in memory cache]"
Feb 01 08:11:43 crc kubenswrapper[4546]: E0201 08:11:43.541509 4546 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8e7d24_86f3_45b4_b3a6_5124f6b4c325.slice/crio-6202a5202e0c60fed3607537f69acb588b284e00b94ddd185c5b43a43fcfe3af\": RecentStats: unable to find data in memory cache]"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.761133 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lw45k"]
Feb 01 08:12:15 crc kubenswrapper[4546]: E0201 08:12:15.762251 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerName="extract-content"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.762268 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerName="extract-content"
Feb 01 08:12:15 crc kubenswrapper[4546]: E0201 08:12:15.762285 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerName="registry-server"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.762291 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerName="registry-server"
Feb 01 08:12:15 crc kubenswrapper[4546]: E0201 08:12:15.762320 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerName="extract-utilities"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.762327 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerName="extract-utilities"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.762529 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c8e7d24-86f3-45b4-b3a6-5124f6b4c325" containerName="registry-server"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.764108 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.779932 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lw45k"]
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.829792 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tg7t\" (UniqueName: \"kubernetes.io/projected/2de9de5e-6900-425c-bea9-00556fd57cd9-kube-api-access-7tg7t\") pod \"community-operators-lw45k\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") " pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.830446 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-catalog-content\") pod \"community-operators-lw45k\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") " pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.830787 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-utilities\") pod \"community-operators-lw45k\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") " pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.933761 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-utilities\") pod \"community-operators-lw45k\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") " pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.933923 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tg7t\" (UniqueName: \"kubernetes.io/projected/2de9de5e-6900-425c-bea9-00556fd57cd9-kube-api-access-7tg7t\") pod \"community-operators-lw45k\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") " pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.933991 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-catalog-content\") pod \"community-operators-lw45k\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") " pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.934520 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-catalog-content\") pod \"community-operators-lw45k\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") " pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.934589 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-utilities\") pod \"community-operators-lw45k\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") " pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:15 crc kubenswrapper[4546]: I0201 08:12:15.956956 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tg7t\" (UniqueName: \"kubernetes.io/projected/2de9de5e-6900-425c-bea9-00556fd57cd9-kube-api-access-7tg7t\") pod \"community-operators-lw45k\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") " pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:16 crc kubenswrapper[4546]: I0201 08:12:16.116544 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:16 crc kubenswrapper[4546]: I0201 08:12:16.719303 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lw45k"]
Feb 01 08:12:16 crc kubenswrapper[4546]: I0201 08:12:16.856936 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw45k" event={"ID":"2de9de5e-6900-425c-bea9-00556fd57cd9","Type":"ContainerStarted","Data":"0bf2480a39c27c0d85fee69ab2b853b5c7805d214e9cc98a9ac48e767397d0fd"}
Feb 01 08:12:17 crc kubenswrapper[4546]: I0201 08:12:17.865784 4546 generic.go:334] "Generic (PLEG): container finished" podID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerID="caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68" exitCode=0
Feb 01 08:12:17 crc kubenswrapper[4546]: I0201 08:12:17.865870 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw45k" event={"ID":"2de9de5e-6900-425c-bea9-00556fd57cd9","Type":"ContainerDied","Data":"caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68"}
Feb 01 08:12:18 crc kubenswrapper[4546]: I0201 08:12:18.880011 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw45k" event={"ID":"2de9de5e-6900-425c-bea9-00556fd57cd9","Type":"ContainerStarted","Data":"05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af"}
Feb 01 08:12:19 crc kubenswrapper[4546]: I0201 08:12:19.892093 4546 generic.go:334] "Generic (PLEG): container finished" podID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerID="05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af" exitCode=0
Feb 01 08:12:19 crc kubenswrapper[4546]: I0201 08:12:19.892846 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw45k" event={"ID":"2de9de5e-6900-425c-bea9-00556fd57cd9","Type":"ContainerDied","Data":"05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af"}
Feb 01 08:12:20 crc kubenswrapper[4546]: I0201 08:12:20.901516 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw45k" event={"ID":"2de9de5e-6900-425c-bea9-00556fd57cd9","Type":"ContainerStarted","Data":"ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5"}
Feb 01 08:12:20 crc kubenswrapper[4546]: I0201 08:12:20.918541 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lw45k" podStartSLOduration=3.411195853 podStartE2EDuration="5.918529472s" podCreationTimestamp="2026-02-01 08:12:15 +0000 UTC" firstStartedPulling="2026-02-01 08:12:17.86796188 +0000 UTC m=+5368.518897897" lastFinishedPulling="2026-02-01 08:12:20.3752955 +0000 UTC m=+5371.026231516" observedRunningTime="2026-02-01 08:12:20.914842681 +0000 UTC m=+5371.565778696" watchObservedRunningTime="2026-02-01 08:12:20.918529472 +0000 UTC m=+5371.569465488"
Feb 01 08:12:26 crc kubenswrapper[4546]: I0201 08:12:26.117390 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:26 crc kubenswrapper[4546]: I0201 08:12:26.117916 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:26 crc kubenswrapper[4546]: I0201 08:12:26.154731 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:26 crc kubenswrapper[4546]: I0201 08:12:26.982752 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:27 crc kubenswrapper[4546]: I0201 08:12:27.023740 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lw45k"]
Feb 01 08:12:28 crc kubenswrapper[4546]: I0201 08:12:28.962427 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lw45k" podUID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerName="registry-server" containerID="cri-o://ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5" gracePeriod=2
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.450648 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.578592 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tg7t\" (UniqueName: \"kubernetes.io/projected/2de9de5e-6900-425c-bea9-00556fd57cd9-kube-api-access-7tg7t\") pod \"2de9de5e-6900-425c-bea9-00556fd57cd9\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") "
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.578799 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-utilities\") pod \"2de9de5e-6900-425c-bea9-00556fd57cd9\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") "
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.578983 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-catalog-content\") pod \"2de9de5e-6900-425c-bea9-00556fd57cd9\" (UID: \"2de9de5e-6900-425c-bea9-00556fd57cd9\") "
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.579458 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-utilities" (OuterVolumeSpecName: "utilities") pod "2de9de5e-6900-425c-bea9-00556fd57cd9" (UID: "2de9de5e-6900-425c-bea9-00556fd57cd9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.579877 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.586426 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de9de5e-6900-425c-bea9-00556fd57cd9-kube-api-access-7tg7t" (OuterVolumeSpecName: "kube-api-access-7tg7t") pod "2de9de5e-6900-425c-bea9-00556fd57cd9" (UID: "2de9de5e-6900-425c-bea9-00556fd57cd9"). InnerVolumeSpecName "kube-api-access-7tg7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.620894 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2de9de5e-6900-425c-bea9-00556fd57cd9" (UID: "2de9de5e-6900-425c-bea9-00556fd57cd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.682140 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2de9de5e-6900-425c-bea9-00556fd57cd9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.682169 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tg7t\" (UniqueName: \"kubernetes.io/projected/2de9de5e-6900-425c-bea9-00556fd57cd9-kube-api-access-7tg7t\") on node \"crc\" DevicePath \"\""
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.972751 4546 generic.go:334] "Generic (PLEG): container finished" podID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerID="ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5" exitCode=0
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.972803 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lw45k"
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.972819 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw45k" event={"ID":"2de9de5e-6900-425c-bea9-00556fd57cd9","Type":"ContainerDied","Data":"ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5"}
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.972894 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lw45k" event={"ID":"2de9de5e-6900-425c-bea9-00556fd57cd9","Type":"ContainerDied","Data":"0bf2480a39c27c0d85fee69ab2b853b5c7805d214e9cc98a9ac48e767397d0fd"}
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.972918 4546 scope.go:117] "RemoveContainer" containerID="ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5"
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.994510 4546 scope.go:117] "RemoveContainer" containerID="05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af"
Feb 01 08:12:29 crc kubenswrapper[4546]: I0201 08:12:29.998183 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lw45k"]
Feb 01 08:12:30 crc kubenswrapper[4546]: I0201 08:12:30.008556 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lw45k"]
Feb 01 08:12:30 crc kubenswrapper[4546]: I0201 08:12:30.014931 4546 scope.go:117] "RemoveContainer" containerID="caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68"
Feb 01 08:12:30 crc kubenswrapper[4546]: I0201 08:12:30.047272 4546 scope.go:117] "RemoveContainer" containerID="ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5"
Feb 01 08:12:30 crc kubenswrapper[4546]: E0201 08:12:30.047726 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5\": container with ID starting with ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5 not found: ID does not exist" containerID="ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5"
Feb 01 08:12:30 crc kubenswrapper[4546]: I0201 08:12:30.047768 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5"} err="failed to get container status \"ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5\": rpc error: code = NotFound desc = could not find container \"ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5\": container with ID starting with ca592e1c5406f8c74c4495cf3202497ce4b888acf9ea0a2b7d356295ec4711e5 not found: ID does not exist"
Feb 01 08:12:30 crc kubenswrapper[4546]: I0201 08:12:30.047797 4546 scope.go:117] "RemoveContainer" containerID="05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af"
Feb 01 08:12:30 crc kubenswrapper[4546]: E0201 08:12:30.048104 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af\": container with ID starting with 05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af not found: ID does not exist" containerID="05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af"
Feb 01 08:12:30 crc kubenswrapper[4546]: I0201 08:12:30.048131 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af"} err="failed to get container status \"05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af\": rpc error: code = NotFound desc = could not find container \"05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af\": container with ID starting with 05b53b2bd483a103f891df935cb90626330e4a9f5413038c9f556b4651bdc5af not found: ID does not exist"
Feb 01 08:12:30 crc kubenswrapper[4546]: I0201 08:12:30.048147 4546 scope.go:117] "RemoveContainer" containerID="caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68"
Feb 01 08:12:30 crc kubenswrapper[4546]: E0201 08:12:30.048442 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68\": container with ID starting with caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68 not found: ID does not exist" containerID="caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68"
Feb 01 08:12:30 crc kubenswrapper[4546]: I0201 08:12:30.048469 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68"} err="failed to get container status \"caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68\": rpc error: code = NotFound desc = could not find container \"caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68\": container with ID starting with caa6de9f36d91e0ac310476ef845d82b781f461c8a285ac997f5272e006d0d68 not found: ID does not exist"
Feb 01 08:12:31 crc kubenswrapper[4546]: I0201 08:12:31.664305 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de9de5e-6900-425c-bea9-00556fd57cd9" path="/var/lib/kubelet/pods/2de9de5e-6900-425c-bea9-00556fd57cd9/volumes"
Feb 01 08:13:25 crc kubenswrapper[4546]: I0201 08:13:25.420354 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:13:25 crc kubenswrapper[4546]: I0201 08:13:25.420769 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:13:55 crc kubenswrapper[4546]: I0201 08:13:55.421115 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:13:55 crc kubenswrapper[4546]: I0201 08:13:55.422374 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:14:25 crc kubenswrapper[4546]: I0201 08:14:25.421501 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:14:25 crc kubenswrapper[4546]: I0201 08:14:25.422916 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:14:25 crc kubenswrapper[4546]: I0201 08:14:25.423019 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx"
Feb 01 08:14:25 crc kubenswrapper[4546]: I0201 08:14:25.424350 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b1aa0cd05fbc7eb92e72bf3f4b425f898067f0e184912d1ba7ba0f29572dfa2"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 01 08:14:25 crc kubenswrapper[4546]: I0201 08:14:25.424439 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://7b1aa0cd05fbc7eb92e72bf3f4b425f898067f0e184912d1ba7ba0f29572dfa2" gracePeriod=600
Feb 01 08:14:25 crc kubenswrapper[4546]: I0201 08:14:25.946100 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="7b1aa0cd05fbc7eb92e72bf3f4b425f898067f0e184912d1ba7ba0f29572dfa2" exitCode=0
Feb 01 08:14:25 crc kubenswrapper[4546]: I0201 08:14:25.946208 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"7b1aa0cd05fbc7eb92e72bf3f4b425f898067f0e184912d1ba7ba0f29572dfa2"}
Feb 01 08:14:25 crc kubenswrapper[4546]: I0201 08:14:25.946761 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268"}
Feb 01 08:14:25 crc kubenswrapper[4546]: I0201 08:14:25.946804 4546 scope.go:117] "RemoveContainer" containerID="1d73a613fef3a9a592e0b31f7f17b9c1a73e18fe68113e14acdcca96582f98c8"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.169134 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"]
Feb 01 08:15:00 crc kubenswrapper[4546]: E0201 08:15:00.170119 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerName="extract-content"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.170132 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerName="extract-content"
Feb 01 08:15:00 crc kubenswrapper[4546]: E0201 08:15:00.170158 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerName="registry-server"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.170164 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerName="registry-server"
Feb 01 08:15:00 crc kubenswrapper[4546]: E0201 08:15:00.170189 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerName="extract-utilities"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.170195 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerName="extract-utilities"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.170367 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de9de5e-6900-425c-bea9-00556fd57cd9" containerName="registry-server"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.171052 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.180398 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"]
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.182282 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.232926 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.278772 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e154031-2652-43f7-8970-38bd3e61f165-config-volume\") pod \"collect-profiles-29498895-6sz5j\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.278878 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gb6c\" (UniqueName: \"kubernetes.io/projected/5e154031-2652-43f7-8970-38bd3e61f165-kube-api-access-7gb6c\") pod \"collect-profiles-29498895-6sz5j\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.278929 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e154031-2652-43f7-8970-38bd3e61f165-secret-volume\") pod \"collect-profiles-29498895-6sz5j\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.381012 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e154031-2652-43f7-8970-38bd3e61f165-config-volume\") pod \"collect-profiles-29498895-6sz5j\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.381081 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gb6c\" (UniqueName: \"kubernetes.io/projected/5e154031-2652-43f7-8970-38bd3e61f165-kube-api-access-7gb6c\") pod \"collect-profiles-29498895-6sz5j\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.381113 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e154031-2652-43f7-8970-38bd3e61f165-secret-volume\") pod \"collect-profiles-29498895-6sz5j\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.381793 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e154031-2652-43f7-8970-38bd3e61f165-config-volume\") pod \"collect-profiles-29498895-6sz5j\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.394413 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e154031-2652-43f7-8970-38bd3e61f165-secret-volume\") pod \"collect-profiles-29498895-6sz5j\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.395553 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gb6c\" (UniqueName: \"kubernetes.io/projected/5e154031-2652-43f7-8970-38bd3e61f165-kube-api-access-7gb6c\") pod \"collect-profiles-29498895-6sz5j\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.489151 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:00 crc kubenswrapper[4546]: I0201 08:15:00.927304 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"]
Feb 01 08:15:01 crc kubenswrapper[4546]: I0201 08:15:01.247168 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j" event={"ID":"5e154031-2652-43f7-8970-38bd3e61f165","Type":"ContainerStarted","Data":"721227fa12378e3bb777b268c4708b2140feba5007d4f41f47ca8dda57fe0a16"}
Feb 01 08:15:01 crc kubenswrapper[4546]: I0201 08:15:01.247221 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j" event={"ID":"5e154031-2652-43f7-8970-38bd3e61f165","Type":"ContainerStarted","Data":"9be656e5db7011e191947dd150fe2e3079d45fef04547a11827472f2cd8f58ca"}
Feb 01 08:15:01 crc kubenswrapper[4546]: I0201 08:15:01.281292 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j" podStartSLOduration=1.2812778329999999 podStartE2EDuration="1.281277833s" podCreationTimestamp="2026-02-01 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:15:01.277539775 +0000 UTC m=+5531.928475781" watchObservedRunningTime="2026-02-01 08:15:01.281277833 +0000 UTC m=+5531.932213848"
Feb 01 08:15:02 crc kubenswrapper[4546]: I0201 08:15:02.256381 4546 generic.go:334] "Generic (PLEG): container finished" podID="5e154031-2652-43f7-8970-38bd3e61f165" containerID="721227fa12378e3bb777b268c4708b2140feba5007d4f41f47ca8dda57fe0a16" exitCode=0
Feb 01 08:15:02 crc kubenswrapper[4546]: I0201 08:15:02.256431 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j" event={"ID":"5e154031-2652-43f7-8970-38bd3e61f165","Type":"ContainerDied","Data":"721227fa12378e3bb777b268c4708b2140feba5007d4f41f47ca8dda57fe0a16"}
Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.687031 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"
Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.763331 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e154031-2652-43f7-8970-38bd3e61f165-config-volume\") pod \"5e154031-2652-43f7-8970-38bd3e61f165\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") "
Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.763671 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e154031-2652-43f7-8970-38bd3e61f165-secret-volume\") pod \"5e154031-2652-43f7-8970-38bd3e61f165\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") "
Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.763763 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e154031-2652-43f7-8970-38bd3e61f165-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e154031-2652-43f7-8970-38bd3e61f165" (UID: "5e154031-2652-43f7-8970-38bd3e61f165"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.764838 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gb6c\" (UniqueName: \"kubernetes.io/projected/5e154031-2652-43f7-8970-38bd3e61f165-kube-api-access-7gb6c\") pod \"5e154031-2652-43f7-8970-38bd3e61f165\" (UID: \"5e154031-2652-43f7-8970-38bd3e61f165\") "
Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.766067 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e154031-2652-43f7-8970-38bd3e61f165-config-volume\") on node \"crc\" DevicePath \"\""
Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.771089 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e154031-2652-43f7-8970-38bd3e61f165-kube-api-access-7gb6c" (OuterVolumeSpecName: "kube-api-access-7gb6c") pod "5e154031-2652-43f7-8970-38bd3e61f165" (UID: "5e154031-2652-43f7-8970-38bd3e61f165"). InnerVolumeSpecName "kube-api-access-7gb6c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.771251 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e154031-2652-43f7-8970-38bd3e61f165-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e154031-2652-43f7-8970-38bd3e61f165" (UID: "5e154031-2652-43f7-8970-38bd3e61f165"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.868273 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e154031-2652-43f7-8970-38bd3e61f165-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:15:03 crc kubenswrapper[4546]: I0201 08:15:03.868394 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gb6c\" (UniqueName: \"kubernetes.io/projected/5e154031-2652-43f7-8970-38bd3e61f165-kube-api-access-7gb6c\") on node \"crc\" DevicePath \"\"" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.278339 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j" event={"ID":"5e154031-2652-43f7-8970-38bd3e61f165","Type":"ContainerDied","Data":"9be656e5db7011e191947dd150fe2e3079d45fef04547a11827472f2cd8f58ca"} Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.278761 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9be656e5db7011e191947dd150fe2e3079d45fef04547a11827472f2cd8f58ca" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.278442 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.454418 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nfwdn"] Feb 01 08:15:04 crc kubenswrapper[4546]: E0201 08:15:04.455218 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e154031-2652-43f7-8970-38bd3e61f165" containerName="collect-profiles" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.455244 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e154031-2652-43f7-8970-38bd3e61f165" containerName="collect-profiles" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.455635 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e154031-2652-43f7-8970-38bd3e61f165" containerName="collect-profiles" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.457528 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.467144 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nfwdn"] Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.480218 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59qvg\" (UniqueName: \"kubernetes.io/projected/a9b8f603-5e7e-40c4-95ff-525b45a9f539-kube-api-access-59qvg\") pod \"redhat-operators-nfwdn\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.480280 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-utilities\") pod \"redhat-operators-nfwdn\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " 
pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.480334 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-catalog-content\") pod \"redhat-operators-nfwdn\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.582162 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59qvg\" (UniqueName: \"kubernetes.io/projected/a9b8f603-5e7e-40c4-95ff-525b45a9f539-kube-api-access-59qvg\") pod \"redhat-operators-nfwdn\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.582580 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-utilities\") pod \"redhat-operators-nfwdn\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.582613 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-catalog-content\") pod \"redhat-operators-nfwdn\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.583177 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-catalog-content\") pod \"redhat-operators-nfwdn\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " 
pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.583281 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-utilities\") pod \"redhat-operators-nfwdn\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.601938 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59qvg\" (UniqueName: \"kubernetes.io/projected/a9b8f603-5e7e-40c4-95ff-525b45a9f539-kube-api-access-59qvg\") pod \"redhat-operators-nfwdn\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.783075 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.783877 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc"] Feb 01 08:15:04 crc kubenswrapper[4546]: I0201 08:15:04.789572 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-lfmzc"] Feb 01 08:15:05 crc kubenswrapper[4546]: I0201 08:15:05.221947 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nfwdn"] Feb 01 08:15:05 crc kubenswrapper[4546]: I0201 08:15:05.291869 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nfwdn" event={"ID":"a9b8f603-5e7e-40c4-95ff-525b45a9f539","Type":"ContainerStarted","Data":"136ad8993933784b9ee9d6aab657f227cd2581e91549fec68d068551928cfe18"} Feb 01 08:15:05 crc kubenswrapper[4546]: I0201 08:15:05.663272 4546 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="d72617a1-043e-418e-906c-41c594b4708c" path="/var/lib/kubelet/pods/d72617a1-043e-418e-906c-41c594b4708c/volumes" Feb 01 08:15:06 crc kubenswrapper[4546]: I0201 08:15:06.308527 4546 generic.go:334] "Generic (PLEG): container finished" podID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerID="41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939" exitCode=0 Feb 01 08:15:06 crc kubenswrapper[4546]: I0201 08:15:06.308589 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nfwdn" event={"ID":"a9b8f603-5e7e-40c4-95ff-525b45a9f539","Type":"ContainerDied","Data":"41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939"} Feb 01 08:15:07 crc kubenswrapper[4546]: I0201 08:15:07.317635 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nfwdn" event={"ID":"a9b8f603-5e7e-40c4-95ff-525b45a9f539","Type":"ContainerStarted","Data":"c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1"} Feb 01 08:15:10 crc kubenswrapper[4546]: I0201 08:15:10.364600 4546 generic.go:334] "Generic (PLEG): container finished" podID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerID="c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1" exitCode=0 Feb 01 08:15:10 crc kubenswrapper[4546]: I0201 08:15:10.365207 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nfwdn" event={"ID":"a9b8f603-5e7e-40c4-95ff-525b45a9f539","Type":"ContainerDied","Data":"c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1"} Feb 01 08:15:11 crc kubenswrapper[4546]: I0201 08:15:11.375848 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nfwdn" event={"ID":"a9b8f603-5e7e-40c4-95ff-525b45a9f539","Type":"ContainerStarted","Data":"a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd"} Feb 01 08:15:11 crc kubenswrapper[4546]: I0201 08:15:11.395810 4546 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nfwdn" podStartSLOduration=2.777165939 podStartE2EDuration="7.39578753s" podCreationTimestamp="2026-02-01 08:15:04 +0000 UTC" firstStartedPulling="2026-02-01 08:15:06.31210631 +0000 UTC m=+5536.963042326" lastFinishedPulling="2026-02-01 08:15:10.930727901 +0000 UTC m=+5541.581663917" observedRunningTime="2026-02-01 08:15:11.393513893 +0000 UTC m=+5542.044449909" watchObservedRunningTime="2026-02-01 08:15:11.39578753 +0000 UTC m=+5542.046723546" Feb 01 08:15:14 crc kubenswrapper[4546]: I0201 08:15:14.783823 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:14 crc kubenswrapper[4546]: I0201 08:15:14.784098 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:15 crc kubenswrapper[4546]: I0201 08:15:15.822381 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nfwdn" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="registry-server" probeResult="failure" output=< Feb 01 08:15:15 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 08:15:15 crc kubenswrapper[4546]: > Feb 01 08:15:25 crc kubenswrapper[4546]: I0201 08:15:25.819102 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nfwdn" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="registry-server" probeResult="failure" output=< Feb 01 08:15:25 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 08:15:25 crc kubenswrapper[4546]: > Feb 01 08:15:34 crc kubenswrapper[4546]: I0201 08:15:34.820747 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:34 crc kubenswrapper[4546]: 
I0201 08:15:34.860336 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:35 crc kubenswrapper[4546]: I0201 08:15:35.644017 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nfwdn"] Feb 01 08:15:36 crc kubenswrapper[4546]: I0201 08:15:36.613228 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nfwdn" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="registry-server" containerID="cri-o://a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd" gracePeriod=2 Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.085676 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.220295 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-catalog-content\") pod \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.220377 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59qvg\" (UniqueName: \"kubernetes.io/projected/a9b8f603-5e7e-40c4-95ff-525b45a9f539-kube-api-access-59qvg\") pod \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.220418 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-utilities\") pod \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\" (UID: \"a9b8f603-5e7e-40c4-95ff-525b45a9f539\") " Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 
08:15:37.221358 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-utilities" (OuterVolumeSpecName: "utilities") pod "a9b8f603-5e7e-40c4-95ff-525b45a9f539" (UID: "a9b8f603-5e7e-40c4-95ff-525b45a9f539"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.221556 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.229978 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b8f603-5e7e-40c4-95ff-525b45a9f539-kube-api-access-59qvg" (OuterVolumeSpecName: "kube-api-access-59qvg") pod "a9b8f603-5e7e-40c4-95ff-525b45a9f539" (UID: "a9b8f603-5e7e-40c4-95ff-525b45a9f539"). InnerVolumeSpecName "kube-api-access-59qvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.312543 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9b8f603-5e7e-40c4-95ff-525b45a9f539" (UID: "a9b8f603-5e7e-40c4-95ff-525b45a9f539"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.324021 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9b8f603-5e7e-40c4-95ff-525b45a9f539-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.324050 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59qvg\" (UniqueName: \"kubernetes.io/projected/a9b8f603-5e7e-40c4-95ff-525b45a9f539-kube-api-access-59qvg\") on node \"crc\" DevicePath \"\"" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.623030 4546 generic.go:334] "Generic (PLEG): container finished" podID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerID="a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd" exitCode=0 Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.623087 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nfwdn" event={"ID":"a9b8f603-5e7e-40c4-95ff-525b45a9f539","Type":"ContainerDied","Data":"a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd"} Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.623131 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nfwdn" event={"ID":"a9b8f603-5e7e-40c4-95ff-525b45a9f539","Type":"ContainerDied","Data":"136ad8993933784b9ee9d6aab657f227cd2581e91549fec68d068551928cfe18"} Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.623372 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nfwdn" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.623581 4546 scope.go:117] "RemoveContainer" containerID="a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.669994 4546 scope.go:117] "RemoveContainer" containerID="c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.774032 4546 scope.go:117] "RemoveContainer" containerID="41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.815556 4546 scope.go:117] "RemoveContainer" containerID="a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd" Feb 01 08:15:37 crc kubenswrapper[4546]: E0201 08:15:37.815998 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd\": container with ID starting with a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd not found: ID does not exist" containerID="a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.816032 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd"} err="failed to get container status \"a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd\": rpc error: code = NotFound desc = could not find container \"a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd\": container with ID starting with a00b47af2f0979cbb22d6c2cb8b1c24211b2fb88001c1eee7b84dce7930911fd not found: ID does not exist" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.816052 4546 scope.go:117] "RemoveContainer" 
containerID="c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1" Feb 01 08:15:37 crc kubenswrapper[4546]: E0201 08:15:37.821290 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1\": container with ID starting with c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1 not found: ID does not exist" containerID="c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.821321 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1"} err="failed to get container status \"c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1\": rpc error: code = NotFound desc = could not find container \"c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1\": container with ID starting with c76d4cc09df522a55c413c34ec09c61064e0ccd6d9d44751bd4fc09b44fc3fe1 not found: ID does not exist" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.821342 4546 scope.go:117] "RemoveContainer" containerID="41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939" Feb 01 08:15:37 crc kubenswrapper[4546]: E0201 08:15:37.821869 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939\": container with ID starting with 41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939 not found: ID does not exist" containerID="41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.821894 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939"} err="failed to get container status \"41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939\": rpc error: code = NotFound desc = could not find container \"41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939\": container with ID starting with 41d208ec8364f4bb5b534f711440fa82c87761470e2325f8c7b050468ac8a939 not found: ID does not exist" Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.869177 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nfwdn"] Feb 01 08:15:37 crc kubenswrapper[4546]: I0201 08:15:37.890085 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nfwdn"] Feb 01 08:15:39 crc kubenswrapper[4546]: I0201 08:15:39.665956 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" path="/var/lib/kubelet/pods/a9b8f603-5e7e-40c4-95ff-525b45a9f539/volumes" Feb 01 08:15:59 crc kubenswrapper[4546]: I0201 08:15:59.341594 4546 scope.go:117] "RemoveContainer" containerID="836ab4957a933756f4fd2270dd11e1ecb73e65abf0c6bf865774dc134c2f86f6" Feb 01 08:16:25 crc kubenswrapper[4546]: I0201 08:16:25.421157 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:16:25 crc kubenswrapper[4546]: I0201 08:16:25.421793 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:16:47 crc 
kubenswrapper[4546]: I0201 08:16:47.090102 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8twpv"] Feb 01 08:16:47 crc kubenswrapper[4546]: E0201 08:16:47.090959 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="extract-content" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.090973 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="extract-content" Feb 01 08:16:47 crc kubenswrapper[4546]: E0201 08:16:47.090986 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="extract-utilities" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.090992 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="extract-utilities" Feb 01 08:16:47 crc kubenswrapper[4546]: E0201 08:16:47.091022 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="registry-server" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.091029 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="registry-server" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.091172 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b8f603-5e7e-40c4-95ff-525b45a9f539" containerName="registry-server" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.092297 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.100658 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8twpv"] Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.124314 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-catalog-content\") pod \"redhat-marketplace-8twpv\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.124407 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-utilities\") pod \"redhat-marketplace-8twpv\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.124456 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w52tk\" (UniqueName: \"kubernetes.io/projected/7afaafe7-61b9-453e-9698-3511ce51a1c2-kube-api-access-w52tk\") pod \"redhat-marketplace-8twpv\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.225892 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-catalog-content\") pod \"redhat-marketplace-8twpv\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.225983 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-utilities\") pod \"redhat-marketplace-8twpv\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.226017 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w52tk\" (UniqueName: \"kubernetes.io/projected/7afaafe7-61b9-453e-9698-3511ce51a1c2-kube-api-access-w52tk\") pod \"redhat-marketplace-8twpv\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.226421 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-catalog-content\") pod \"redhat-marketplace-8twpv\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.226631 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-utilities\") pod \"redhat-marketplace-8twpv\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.246454 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w52tk\" (UniqueName: \"kubernetes.io/projected/7afaafe7-61b9-453e-9698-3511ce51a1c2-kube-api-access-w52tk\") pod \"redhat-marketplace-8twpv\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.407808 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:47 crc kubenswrapper[4546]: I0201 08:16:47.880640 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8twpv"] Feb 01 08:16:48 crc kubenswrapper[4546]: I0201 08:16:48.263727 4546 generic.go:334] "Generic (PLEG): container finished" podID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerID="9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55" exitCode=0 Feb 01 08:16:48 crc kubenswrapper[4546]: I0201 08:16:48.263821 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8twpv" event={"ID":"7afaafe7-61b9-453e-9698-3511ce51a1c2","Type":"ContainerDied","Data":"9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55"} Feb 01 08:16:48 crc kubenswrapper[4546]: I0201 08:16:48.264084 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8twpv" event={"ID":"7afaafe7-61b9-453e-9698-3511ce51a1c2","Type":"ContainerStarted","Data":"8e704828580958e46571862d36e319610b2a0500777aa49de8f079b3ae598d0d"} Feb 01 08:16:48 crc kubenswrapper[4546]: I0201 08:16:48.267638 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:16:49 crc kubenswrapper[4546]: I0201 08:16:49.275038 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8twpv" event={"ID":"7afaafe7-61b9-453e-9698-3511ce51a1c2","Type":"ContainerStarted","Data":"1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516"} Feb 01 08:16:50 crc kubenswrapper[4546]: I0201 08:16:50.285483 4546 generic.go:334] "Generic (PLEG): container finished" podID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerID="1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516" exitCode=0 Feb 01 08:16:50 crc kubenswrapper[4546]: I0201 08:16:50.285547 4546 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-8twpv" event={"ID":"7afaafe7-61b9-453e-9698-3511ce51a1c2","Type":"ContainerDied","Data":"1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516"} Feb 01 08:16:51 crc kubenswrapper[4546]: I0201 08:16:51.295914 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8twpv" event={"ID":"7afaafe7-61b9-453e-9698-3511ce51a1c2","Type":"ContainerStarted","Data":"def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325"} Feb 01 08:16:51 crc kubenswrapper[4546]: I0201 08:16:51.314285 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8twpv" podStartSLOduration=1.839877636 podStartE2EDuration="4.314265647s" podCreationTimestamp="2026-02-01 08:16:47 +0000 UTC" firstStartedPulling="2026-02-01 08:16:48.26598158 +0000 UTC m=+5638.916917596" lastFinishedPulling="2026-02-01 08:16:50.74036959 +0000 UTC m=+5641.391305607" observedRunningTime="2026-02-01 08:16:51.312473818 +0000 UTC m=+5641.963409834" watchObservedRunningTime="2026-02-01 08:16:51.314265647 +0000 UTC m=+5641.965201663" Feb 01 08:16:55 crc kubenswrapper[4546]: I0201 08:16:55.420602 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:16:55 crc kubenswrapper[4546]: I0201 08:16:55.421024 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:16:57 crc kubenswrapper[4546]: I0201 08:16:57.408249 4546 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:57 crc kubenswrapper[4546]: I0201 08:16:57.408467 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:57 crc kubenswrapper[4546]: I0201 08:16:57.442979 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:58 crc kubenswrapper[4546]: I0201 08:16:58.376717 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:16:58 crc kubenswrapper[4546]: I0201 08:16:58.415236 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8twpv"] Feb 01 08:17:00 crc kubenswrapper[4546]: I0201 08:17:00.356200 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8twpv" podUID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerName="registry-server" containerID="cri-o://def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325" gracePeriod=2 Feb 01 08:17:00 crc kubenswrapper[4546]: I0201 08:17:00.829448 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:17:00 crc kubenswrapper[4546]: I0201 08:17:00.953215 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-utilities\") pod \"7afaafe7-61b9-453e-9698-3511ce51a1c2\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " Feb 01 08:17:00 crc kubenswrapper[4546]: I0201 08:17:00.953411 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-catalog-content\") pod \"7afaafe7-61b9-453e-9698-3511ce51a1c2\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " Feb 01 08:17:00 crc kubenswrapper[4546]: I0201 08:17:00.953526 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w52tk\" (UniqueName: \"kubernetes.io/projected/7afaafe7-61b9-453e-9698-3511ce51a1c2-kube-api-access-w52tk\") pod \"7afaafe7-61b9-453e-9698-3511ce51a1c2\" (UID: \"7afaafe7-61b9-453e-9698-3511ce51a1c2\") " Feb 01 08:17:00 crc kubenswrapper[4546]: I0201 08:17:00.953840 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-utilities" (OuterVolumeSpecName: "utilities") pod "7afaafe7-61b9-453e-9698-3511ce51a1c2" (UID: "7afaafe7-61b9-453e-9698-3511ce51a1c2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:17:00 crc kubenswrapper[4546]: I0201 08:17:00.954000 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:17:00 crc kubenswrapper[4546]: I0201 08:17:00.958768 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afaafe7-61b9-453e-9698-3511ce51a1c2-kube-api-access-w52tk" (OuterVolumeSpecName: "kube-api-access-w52tk") pod "7afaafe7-61b9-453e-9698-3511ce51a1c2" (UID: "7afaafe7-61b9-453e-9698-3511ce51a1c2"). InnerVolumeSpecName "kube-api-access-w52tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:17:00 crc kubenswrapper[4546]: I0201 08:17:00.969421 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7afaafe7-61b9-453e-9698-3511ce51a1c2" (UID: "7afaafe7-61b9-453e-9698-3511ce51a1c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.055749 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7afaafe7-61b9-453e-9698-3511ce51a1c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.055778 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w52tk\" (UniqueName: \"kubernetes.io/projected/7afaafe7-61b9-453e-9698-3511ce51a1c2-kube-api-access-w52tk\") on node \"crc\" DevicePath \"\"" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.363985 4546 generic.go:334] "Generic (PLEG): container finished" podID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerID="def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325" exitCode=0 Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.364028 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8twpv" event={"ID":"7afaafe7-61b9-453e-9698-3511ce51a1c2","Type":"ContainerDied","Data":"def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325"} Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.364085 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8twpv" event={"ID":"7afaafe7-61b9-453e-9698-3511ce51a1c2","Type":"ContainerDied","Data":"8e704828580958e46571862d36e319610b2a0500777aa49de8f079b3ae598d0d"} Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.364107 4546 scope.go:117] "RemoveContainer" containerID="def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.364040 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8twpv" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.384566 4546 scope.go:117] "RemoveContainer" containerID="1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.394513 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8twpv"] Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.403015 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8twpv"] Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.426881 4546 scope.go:117] "RemoveContainer" containerID="9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.446148 4546 scope.go:117] "RemoveContainer" containerID="def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325" Feb 01 08:17:01 crc kubenswrapper[4546]: E0201 08:17:01.446761 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325\": container with ID starting with def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325 not found: ID does not exist" containerID="def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.446800 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325"} err="failed to get container status \"def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325\": rpc error: code = NotFound desc = could not find container \"def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325\": container with ID starting with def4d3eceafae35e25f3e8485de714bd2a0f5bd575020d12ccf2044459917325 not found: 
ID does not exist" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.446823 4546 scope.go:117] "RemoveContainer" containerID="1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516" Feb 01 08:17:01 crc kubenswrapper[4546]: E0201 08:17:01.447121 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516\": container with ID starting with 1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516 not found: ID does not exist" containerID="1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.447151 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516"} err="failed to get container status \"1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516\": rpc error: code = NotFound desc = could not find container \"1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516\": container with ID starting with 1fd70808b5ebbb739ee41b6bc6d385b5daf7cc5048faa318507c4a99c43b4516 not found: ID does not exist" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.447168 4546 scope.go:117] "RemoveContainer" containerID="9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55" Feb 01 08:17:01 crc kubenswrapper[4546]: E0201 08:17:01.447499 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55\": container with ID starting with 9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55 not found: ID does not exist" containerID="9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.447533 4546 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55"} err="failed to get container status \"9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55\": rpc error: code = NotFound desc = could not find container \"9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55\": container with ID starting with 9f1c2daf8a2110419f9a608af61eea47dc751889371c7bc836b2bf3b4ddc6f55 not found: ID does not exist" Feb 01 08:17:01 crc kubenswrapper[4546]: I0201 08:17:01.665739 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7afaafe7-61b9-453e-9698-3511ce51a1c2" path="/var/lib/kubelet/pods/7afaafe7-61b9-453e-9698-3511ce51a1c2/volumes" Feb 01 08:17:25 crc kubenswrapper[4546]: I0201 08:17:25.421055 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:17:25 crc kubenswrapper[4546]: I0201 08:17:25.421450 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:17:25 crc kubenswrapper[4546]: I0201 08:17:25.421493 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 08:17:25 crc kubenswrapper[4546]: I0201 08:17:25.422367 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268"} 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:17:25 crc kubenswrapper[4546]: I0201 08:17:25.422422 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" gracePeriod=600 Feb 01 08:17:25 crc kubenswrapper[4546]: E0201 08:17:25.538086 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:17:25 crc kubenswrapper[4546]: I0201 08:17:25.547759 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" exitCode=0 Feb 01 08:17:25 crc kubenswrapper[4546]: I0201 08:17:25.547794 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268"} Feb 01 08:17:25 crc kubenswrapper[4546]: I0201 08:17:25.547826 4546 scope.go:117] "RemoveContainer" containerID="7b1aa0cd05fbc7eb92e72bf3f4b425f898067f0e184912d1ba7ba0f29572dfa2" Feb 01 08:17:25 crc kubenswrapper[4546]: I0201 08:17:25.548277 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 
01 08:17:25 crc kubenswrapper[4546]: E0201 08:17:25.548549 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:17:38 crc kubenswrapper[4546]: I0201 08:17:38.655233 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:17:38 crc kubenswrapper[4546]: E0201 08:17:38.656273 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:17:51 crc kubenswrapper[4546]: I0201 08:17:51.654647 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:17:51 crc kubenswrapper[4546]: E0201 08:17:51.656565 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:18:03 crc kubenswrapper[4546]: I0201 08:18:03.655324 4546 scope.go:117] "RemoveContainer" 
containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:18:03 crc kubenswrapper[4546]: E0201 08:18:03.656206 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:18:17 crc kubenswrapper[4546]: I0201 08:18:17.655190 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:18:17 crc kubenswrapper[4546]: E0201 08:18:17.655932 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:18:28 crc kubenswrapper[4546]: I0201 08:18:28.654893 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:18:28 crc kubenswrapper[4546]: E0201 08:18:28.656825 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:18:41 crc kubenswrapper[4546]: I0201 08:18:41.655127 4546 scope.go:117] 
"RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:18:41 crc kubenswrapper[4546]: E0201 08:18:41.656441 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:18:56 crc kubenswrapper[4546]: I0201 08:18:56.655756 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:18:56 crc kubenswrapper[4546]: E0201 08:18:56.657590 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:19:09 crc kubenswrapper[4546]: I0201 08:19:09.660522 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:19:09 crc kubenswrapper[4546]: E0201 08:19:09.661430 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:19:20 crc kubenswrapper[4546]: I0201 08:19:20.655403 
4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:19:20 crc kubenswrapper[4546]: E0201 08:19:20.656256 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:19:35 crc kubenswrapper[4546]: I0201 08:19:35.655088 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:19:35 crc kubenswrapper[4546]: E0201 08:19:35.655952 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:19:48 crc kubenswrapper[4546]: I0201 08:19:48.654797 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:19:48 crc kubenswrapper[4546]: E0201 08:19:48.655547 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:20:03 crc kubenswrapper[4546]: I0201 
08:20:03.655579 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:20:03 crc kubenswrapper[4546]: E0201 08:20:03.657625 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:20:15 crc kubenswrapper[4546]: I0201 08:20:15.655580 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:20:15 crc kubenswrapper[4546]: E0201 08:20:15.657159 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:20:26 crc kubenswrapper[4546]: I0201 08:20:26.654970 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:20:26 crc kubenswrapper[4546]: E0201 08:20:26.655939 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:20:39 crc 
kubenswrapper[4546]: I0201 08:20:39.660293 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:20:39 crc kubenswrapper[4546]: E0201 08:20:39.661459 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.492070 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-65np9"] Feb 01 08:20:40 crc kubenswrapper[4546]: E0201 08:20:40.492423 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerName="registry-server" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.492434 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerName="registry-server" Feb 01 08:20:40 crc kubenswrapper[4546]: E0201 08:20:40.492450 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerName="extract-content" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.492455 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerName="extract-content" Feb 01 08:20:40 crc kubenswrapper[4546]: E0201 08:20:40.492465 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerName="extract-utilities" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.492470 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afaafe7-61b9-453e-9698-3511ce51a1c2" 
containerName="extract-utilities" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.492670 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="7afaafe7-61b9-453e-9698-3511ce51a1c2" containerName="registry-server" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.493821 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.512310 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65np9"] Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.649161 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-utilities\") pod \"certified-operators-65np9\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.649375 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwrbd\" (UniqueName: \"kubernetes.io/projected/ca501a33-5de6-4fa4-bbaf-cd4296954c72-kube-api-access-gwrbd\") pod \"certified-operators-65np9\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.649499 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-catalog-content\") pod \"certified-operators-65np9\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.751617 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-utilities\") pod \"certified-operators-65np9\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.752063 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-utilities\") pod \"certified-operators-65np9\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.752223 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwrbd\" (UniqueName: \"kubernetes.io/projected/ca501a33-5de6-4fa4-bbaf-cd4296954c72-kube-api-access-gwrbd\") pod \"certified-operators-65np9\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.752279 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-catalog-content\") pod \"certified-operators-65np9\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.752554 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-catalog-content\") pod \"certified-operators-65np9\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.777670 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwrbd\" (UniqueName: 
\"kubernetes.io/projected/ca501a33-5de6-4fa4-bbaf-cd4296954c72-kube-api-access-gwrbd\") pod \"certified-operators-65np9\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:40 crc kubenswrapper[4546]: I0201 08:20:40.809241 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:41 crc kubenswrapper[4546]: I0201 08:20:41.388192 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65np9"] Feb 01 08:20:42 crc kubenswrapper[4546]: I0201 08:20:42.380025 4546 generic.go:334] "Generic (PLEG): container finished" podID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerID="df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268" exitCode=0 Feb 01 08:20:42 crc kubenswrapper[4546]: I0201 08:20:42.381177 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65np9" event={"ID":"ca501a33-5de6-4fa4-bbaf-cd4296954c72","Type":"ContainerDied","Data":"df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268"} Feb 01 08:20:42 crc kubenswrapper[4546]: I0201 08:20:42.381269 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65np9" event={"ID":"ca501a33-5de6-4fa4-bbaf-cd4296954c72","Type":"ContainerStarted","Data":"bc050a586be0eef1d8f1cef6d613535bbecc87c8d68e03d9b496d8295abd6ae7"} Feb 01 08:20:43 crc kubenswrapper[4546]: I0201 08:20:43.394392 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65np9" event={"ID":"ca501a33-5de6-4fa4-bbaf-cd4296954c72","Type":"ContainerStarted","Data":"75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd"} Feb 01 08:20:44 crc kubenswrapper[4546]: I0201 08:20:44.407095 4546 generic.go:334] "Generic (PLEG): container finished" podID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" 
containerID="75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd" exitCode=0 Feb 01 08:20:44 crc kubenswrapper[4546]: I0201 08:20:44.407179 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65np9" event={"ID":"ca501a33-5de6-4fa4-bbaf-cd4296954c72","Type":"ContainerDied","Data":"75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd"} Feb 01 08:20:45 crc kubenswrapper[4546]: I0201 08:20:45.421087 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65np9" event={"ID":"ca501a33-5de6-4fa4-bbaf-cd4296954c72","Type":"ContainerStarted","Data":"b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3"} Feb 01 08:20:45 crc kubenswrapper[4546]: I0201 08:20:45.440575 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-65np9" podStartSLOduration=2.8132455480000003 podStartE2EDuration="5.440559479s" podCreationTimestamp="2026-02-01 08:20:40 +0000 UTC" firstStartedPulling="2026-02-01 08:20:42.383747813 +0000 UTC m=+5873.034683818" lastFinishedPulling="2026-02-01 08:20:45.011061733 +0000 UTC m=+5875.661997749" observedRunningTime="2026-02-01 08:20:45.433609155 +0000 UTC m=+5876.084545170" watchObservedRunningTime="2026-02-01 08:20:45.440559479 +0000 UTC m=+5876.091495495" Feb 01 08:20:50 crc kubenswrapper[4546]: I0201 08:20:50.809711 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:50 crc kubenswrapper[4546]: I0201 08:20:50.810036 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:50 crc kubenswrapper[4546]: I0201 08:20:50.861318 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:51 crc kubenswrapper[4546]: I0201 
08:20:51.510830 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:51 crc kubenswrapper[4546]: I0201 08:20:51.554008 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-65np9"] Feb 01 08:20:53 crc kubenswrapper[4546]: I0201 08:20:53.490729 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-65np9" podUID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerName="registry-server" containerID="cri-o://b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3" gracePeriod=2 Feb 01 08:20:53 crc kubenswrapper[4546]: I0201 08:20:53.930967 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:53 crc kubenswrapper[4546]: I0201 08:20:53.966913 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwrbd\" (UniqueName: \"kubernetes.io/projected/ca501a33-5de6-4fa4-bbaf-cd4296954c72-kube-api-access-gwrbd\") pod \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " Feb 01 08:20:53 crc kubenswrapper[4546]: I0201 08:20:53.967034 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-utilities\") pod \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " Feb 01 08:20:53 crc kubenswrapper[4546]: I0201 08:20:53.967162 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-catalog-content\") pod \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\" (UID: \"ca501a33-5de6-4fa4-bbaf-cd4296954c72\") " Feb 01 08:20:53 crc kubenswrapper[4546]: 
I0201 08:20:53.973673 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-utilities" (OuterVolumeSpecName: "utilities") pod "ca501a33-5de6-4fa4-bbaf-cd4296954c72" (UID: "ca501a33-5de6-4fa4-bbaf-cd4296954c72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:20:53 crc kubenswrapper[4546]: I0201 08:20:53.976369 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca501a33-5de6-4fa4-bbaf-cd4296954c72-kube-api-access-gwrbd" (OuterVolumeSpecName: "kube-api-access-gwrbd") pod "ca501a33-5de6-4fa4-bbaf-cd4296954c72" (UID: "ca501a33-5de6-4fa4-bbaf-cd4296954c72"). InnerVolumeSpecName "kube-api-access-gwrbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.009014 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca501a33-5de6-4fa4-bbaf-cd4296954c72" (UID: "ca501a33-5de6-4fa4-bbaf-cd4296954c72"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.069691 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.069728 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwrbd\" (UniqueName: \"kubernetes.io/projected/ca501a33-5de6-4fa4-bbaf-cd4296954c72-kube-api-access-gwrbd\") on node \"crc\" DevicePath \"\"" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.069738 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca501a33-5de6-4fa4-bbaf-cd4296954c72-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.500218 4546 generic.go:334] "Generic (PLEG): container finished" podID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerID="b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3" exitCode=0 Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.500318 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65np9" event={"ID":"ca501a33-5de6-4fa4-bbaf-cd4296954c72","Type":"ContainerDied","Data":"b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3"} Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.500561 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65np9" event={"ID":"ca501a33-5de6-4fa4-bbaf-cd4296954c72","Type":"ContainerDied","Data":"bc050a586be0eef1d8f1cef6d613535bbecc87c8d68e03d9b496d8295abd6ae7"} Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.500590 4546 scope.go:117] "RemoveContainer" containerID="b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 
08:20:54.500372 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65np9" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.528461 4546 scope.go:117] "RemoveContainer" containerID="75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.532293 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-65np9"] Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.541134 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-65np9"] Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.544289 4546 scope.go:117] "RemoveContainer" containerID="df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.581969 4546 scope.go:117] "RemoveContainer" containerID="b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3" Feb 01 08:20:54 crc kubenswrapper[4546]: E0201 08:20:54.582466 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3\": container with ID starting with b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3 not found: ID does not exist" containerID="b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.582498 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3"} err="failed to get container status \"b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3\": rpc error: code = NotFound desc = could not find container \"b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3\": container with ID starting with 
b87a9b6fb3a5bfd6a1c434af6e7f0a7ac4960e7ecc7e1aeb441f19bf1d3e7cd3 not found: ID does not exist" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.582525 4546 scope.go:117] "RemoveContainer" containerID="75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd" Feb 01 08:20:54 crc kubenswrapper[4546]: E0201 08:20:54.582884 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd\": container with ID starting with 75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd not found: ID does not exist" containerID="75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.582910 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd"} err="failed to get container status \"75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd\": rpc error: code = NotFound desc = could not find container \"75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd\": container with ID starting with 75c8a6a6f029d801a745d89324e845491ad791c1062c387bdf3fa11ff97900dd not found: ID does not exist" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.582927 4546 scope.go:117] "RemoveContainer" containerID="df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268" Feb 01 08:20:54 crc kubenswrapper[4546]: E0201 08:20:54.583165 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268\": container with ID starting with df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268 not found: ID does not exist" containerID="df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268" Feb 01 08:20:54 crc 
kubenswrapper[4546]: I0201 08:20:54.583188 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268"} err="failed to get container status \"df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268\": rpc error: code = NotFound desc = could not find container \"df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268\": container with ID starting with df984542fe727431b653c18dab9fc6c0d218fb8b75340bfbd56a5b3c7d202268 not found: ID does not exist" Feb 01 08:20:54 crc kubenswrapper[4546]: I0201 08:20:54.656490 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:20:54 crc kubenswrapper[4546]: E0201 08:20:54.656801 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:20:55 crc kubenswrapper[4546]: I0201 08:20:55.662547 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" path="/var/lib/kubelet/pods/ca501a33-5de6-4fa4-bbaf-cd4296954c72/volumes" Feb 01 08:21:08 crc kubenswrapper[4546]: I0201 08:21:08.655630 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:21:08 crc kubenswrapper[4546]: E0201 08:21:08.656492 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:21:21 crc kubenswrapper[4546]: I0201 08:21:21.656877 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:21:21 crc kubenswrapper[4546]: E0201 08:21:21.657653 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:21:35 crc kubenswrapper[4546]: I0201 08:21:35.655188 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:21:35 crc kubenswrapper[4546]: E0201 08:21:35.655954 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:21:50 crc kubenswrapper[4546]: I0201 08:21:50.655740 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:21:50 crc kubenswrapper[4546]: E0201 08:21:50.657019 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:22:03 crc kubenswrapper[4546]: I0201 08:22:03.655271 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:22:03 crc kubenswrapper[4546]: E0201 08:22:03.656011 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:22:15 crc kubenswrapper[4546]: I0201 08:22:15.654903 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:22:15 crc kubenswrapper[4546]: E0201 08:22:15.655492 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:22:28 crc kubenswrapper[4546]: I0201 08:22:28.654775 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:22:29 crc kubenswrapper[4546]: I0201 08:22:29.245457 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"5ea9a27553efaf37b6aa9196b22324a6045c968a3acccd76ff7ef21bbcdf4bc6"} Feb 01 08:24:38 crc kubenswrapper[4546]: E0201 08:24:38.495489 4546 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.26.196:48432->192.168.26.196:40843: read tcp 192.168.26.196:48432->192.168.26.196:40843: read: connection reset by peer Feb 01 08:24:55 crc kubenswrapper[4546]: I0201 08:24:55.421386 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:24:55 crc kubenswrapper[4546]: I0201 08:24:55.422101 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.781538 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pbsld"] Feb 01 08:25:12 crc kubenswrapper[4546]: E0201 08:25:12.784461 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerName="registry-server" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.784596 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerName="registry-server" Feb 01 08:25:12 crc kubenswrapper[4546]: E0201 08:25:12.784685 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerName="extract-utilities" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 
08:25:12.784749 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerName="extract-utilities" Feb 01 08:25:12 crc kubenswrapper[4546]: E0201 08:25:12.784805 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerName="extract-content" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.784886 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerName="extract-content" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.785243 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca501a33-5de6-4fa4-bbaf-cd4296954c72" containerName="registry-server" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.787175 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.795082 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbsld"] Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.883413 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-catalog-content\") pod \"redhat-operators-pbsld\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.883565 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-utilities\") pod \"redhat-operators-pbsld\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.883716 4546 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkd2q\" (UniqueName: \"kubernetes.io/projected/20d6d73d-fa17-4cc8-af70-23c61e9ae286-kube-api-access-xkd2q\") pod \"redhat-operators-pbsld\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.985794 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkd2q\" (UniqueName: \"kubernetes.io/projected/20d6d73d-fa17-4cc8-af70-23c61e9ae286-kube-api-access-xkd2q\") pod \"redhat-operators-pbsld\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.985909 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-catalog-content\") pod \"redhat-operators-pbsld\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.985970 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-utilities\") pod \"redhat-operators-pbsld\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.986439 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-utilities\") pod \"redhat-operators-pbsld\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:12 crc kubenswrapper[4546]: I0201 08:25:12.986752 4546 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-catalog-content\") pod \"redhat-operators-pbsld\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:13 crc kubenswrapper[4546]: I0201 08:25:13.020494 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkd2q\" (UniqueName: \"kubernetes.io/projected/20d6d73d-fa17-4cc8-af70-23c61e9ae286-kube-api-access-xkd2q\") pod \"redhat-operators-pbsld\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:13 crc kubenswrapper[4546]: I0201 08:25:13.104077 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:13 crc kubenswrapper[4546]: I0201 08:25:13.591486 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbsld"] Feb 01 08:25:14 crc kubenswrapper[4546]: I0201 08:25:14.559657 4546 generic.go:334] "Generic (PLEG): container finished" podID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerID="7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff" exitCode=0 Feb 01 08:25:14 crc kubenswrapper[4546]: I0201 08:25:14.559794 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbsld" event={"ID":"20d6d73d-fa17-4cc8-af70-23c61e9ae286","Type":"ContainerDied","Data":"7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff"} Feb 01 08:25:14 crc kubenswrapper[4546]: I0201 08:25:14.561898 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbsld" event={"ID":"20d6d73d-fa17-4cc8-af70-23c61e9ae286","Type":"ContainerStarted","Data":"ca208de7856c7a387bf9e8fa6f2de722fc0cc24f39928288e34fbdd95bcd1834"} Feb 01 08:25:14 crc kubenswrapper[4546]: I0201 08:25:14.561414 
4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:25:15 crc kubenswrapper[4546]: I0201 08:25:15.571236 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbsld" event={"ID":"20d6d73d-fa17-4cc8-af70-23c61e9ae286","Type":"ContainerStarted","Data":"613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a"} Feb 01 08:25:18 crc kubenswrapper[4546]: I0201 08:25:18.595438 4546 generic.go:334] "Generic (PLEG): container finished" podID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerID="613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a" exitCode=0 Feb 01 08:25:18 crc kubenswrapper[4546]: I0201 08:25:18.595514 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbsld" event={"ID":"20d6d73d-fa17-4cc8-af70-23c61e9ae286","Type":"ContainerDied","Data":"613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a"} Feb 01 08:25:19 crc kubenswrapper[4546]: I0201 08:25:19.605473 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbsld" event={"ID":"20d6d73d-fa17-4cc8-af70-23c61e9ae286","Type":"ContainerStarted","Data":"01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6"} Feb 01 08:25:19 crc kubenswrapper[4546]: I0201 08:25:19.620109 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pbsld" podStartSLOduration=3.071863336 podStartE2EDuration="7.620095096s" podCreationTimestamp="2026-02-01 08:25:12 +0000 UTC" firstStartedPulling="2026-02-01 08:25:14.561102271 +0000 UTC m=+6145.212038287" lastFinishedPulling="2026-02-01 08:25:19.109334032 +0000 UTC m=+6149.760270047" observedRunningTime="2026-02-01 08:25:19.618967901 +0000 UTC m=+6150.269903918" watchObservedRunningTime="2026-02-01 08:25:19.620095096 +0000 UTC m=+6150.271031113" Feb 01 08:25:23 crc kubenswrapper[4546]: I0201 
08:25:23.104784 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:23 crc kubenswrapper[4546]: I0201 08:25:23.105443 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:24 crc kubenswrapper[4546]: I0201 08:25:24.152100 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pbsld" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="registry-server" probeResult="failure" output=< Feb 01 08:25:24 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 08:25:24 crc kubenswrapper[4546]: > Feb 01 08:25:25 crc kubenswrapper[4546]: I0201 08:25:25.420404 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:25:25 crc kubenswrapper[4546]: I0201 08:25:25.421175 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:25:34 crc kubenswrapper[4546]: I0201 08:25:34.159147 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pbsld" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="registry-server" probeResult="failure" output=< Feb 01 08:25:34 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 08:25:34 crc kubenswrapper[4546]: > Feb 01 08:25:43 crc kubenswrapper[4546]: I0201 08:25:43.220059 4546 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:43 crc kubenswrapper[4546]: I0201 08:25:43.345287 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:43 crc kubenswrapper[4546]: I0201 08:25:43.987033 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbsld"] Feb 01 08:25:44 crc kubenswrapper[4546]: I0201 08:25:44.822528 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pbsld" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="registry-server" containerID="cri-o://01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6" gracePeriod=2 Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.493731 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.556406 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-catalog-content\") pod \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.556559 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-utilities\") pod \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.556650 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkd2q\" (UniqueName: \"kubernetes.io/projected/20d6d73d-fa17-4cc8-af70-23c61e9ae286-kube-api-access-xkd2q\") pod 
\"20d6d73d-fa17-4cc8-af70-23c61e9ae286\" (UID: \"20d6d73d-fa17-4cc8-af70-23c61e9ae286\") " Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.557555 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-utilities" (OuterVolumeSpecName: "utilities") pod "20d6d73d-fa17-4cc8-af70-23c61e9ae286" (UID: "20d6d73d-fa17-4cc8-af70-23c61e9ae286"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.571728 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20d6d73d-fa17-4cc8-af70-23c61e9ae286-kube-api-access-xkd2q" (OuterVolumeSpecName: "kube-api-access-xkd2q") pod "20d6d73d-fa17-4cc8-af70-23c61e9ae286" (UID: "20d6d73d-fa17-4cc8-af70-23c61e9ae286"). InnerVolumeSpecName "kube-api-access-xkd2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.660392 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.660424 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkd2q\" (UniqueName: \"kubernetes.io/projected/20d6d73d-fa17-4cc8-af70-23c61e9ae286-kube-api-access-xkd2q\") on node \"crc\" DevicePath \"\"" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.675557 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20d6d73d-fa17-4cc8-af70-23c61e9ae286" (UID: "20d6d73d-fa17-4cc8-af70-23c61e9ae286"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.764741 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20d6d73d-fa17-4cc8-af70-23c61e9ae286-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.854980 4546 generic.go:334] "Generic (PLEG): container finished" podID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerID="01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6" exitCode=0 Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.855027 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbsld" event={"ID":"20d6d73d-fa17-4cc8-af70-23c61e9ae286","Type":"ContainerDied","Data":"01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6"} Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.855055 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbsld" event={"ID":"20d6d73d-fa17-4cc8-af70-23c61e9ae286","Type":"ContainerDied","Data":"ca208de7856c7a387bf9e8fa6f2de722fc0cc24f39928288e34fbdd95bcd1834"} Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.855074 4546 scope.go:117] "RemoveContainer" containerID="01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.855215 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbsld" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.903084 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbsld"] Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.907152 4546 scope.go:117] "RemoveContainer" containerID="613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.912259 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pbsld"] Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.953692 4546 scope.go:117] "RemoveContainer" containerID="7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.983265 4546 scope.go:117] "RemoveContainer" containerID="01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6" Feb 01 08:25:45 crc kubenswrapper[4546]: E0201 08:25:45.984582 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6\": container with ID starting with 01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6 not found: ID does not exist" containerID="01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.984619 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6"} err="failed to get container status \"01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6\": rpc error: code = NotFound desc = could not find container \"01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6\": container with ID starting with 01ffd3667bb0798ca8b32f466696cbbce03908baebd311291007e4c868a148e6 not found: ID does 
not exist" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.984642 4546 scope.go:117] "RemoveContainer" containerID="613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a" Feb 01 08:25:45 crc kubenswrapper[4546]: E0201 08:25:45.985582 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a\": container with ID starting with 613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a not found: ID does not exist" containerID="613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.985607 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a"} err="failed to get container status \"613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a\": rpc error: code = NotFound desc = could not find container \"613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a\": container with ID starting with 613900353f3ed672126ff4fe509506aa869d975c41757f09d09b7015e0290a3a not found: ID does not exist" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.985620 4546 scope.go:117] "RemoveContainer" containerID="7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff" Feb 01 08:25:45 crc kubenswrapper[4546]: E0201 08:25:45.986104 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff\": container with ID starting with 7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff not found: ID does not exist" containerID="7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff" Feb 01 08:25:45 crc kubenswrapper[4546]: I0201 08:25:45.986145 4546 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff"} err="failed to get container status \"7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff\": rpc error: code = NotFound desc = could not find container \"7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff\": container with ID starting with 7c2a40a1913249d1b0ba70a7eafeb04af93859da48c52a38d04249eb00dd76ff not found: ID does not exist" Feb 01 08:25:47 crc kubenswrapper[4546]: I0201 08:25:47.663947 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" path="/var/lib/kubelet/pods/20d6d73d-fa17-4cc8-af70-23c61e9ae286/volumes" Feb 01 08:25:55 crc kubenswrapper[4546]: I0201 08:25:55.420641 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:25:55 crc kubenswrapper[4546]: I0201 08:25:55.421178 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:25:55 crc kubenswrapper[4546]: I0201 08:25:55.421225 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 08:25:55 crc kubenswrapper[4546]: I0201 08:25:55.422678 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ea9a27553efaf37b6aa9196b22324a6045c968a3acccd76ff7ef21bbcdf4bc6"} 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:25:55 crc kubenswrapper[4546]: I0201 08:25:55.422757 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://5ea9a27553efaf37b6aa9196b22324a6045c968a3acccd76ff7ef21bbcdf4bc6" gracePeriod=600 Feb 01 08:25:55 crc kubenswrapper[4546]: I0201 08:25:55.964439 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="5ea9a27553efaf37b6aa9196b22324a6045c968a3acccd76ff7ef21bbcdf4bc6" exitCode=0 Feb 01 08:25:55 crc kubenswrapper[4546]: I0201 08:25:55.964505 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"5ea9a27553efaf37b6aa9196b22324a6045c968a3acccd76ff7ef21bbcdf4bc6"} Feb 01 08:25:55 crc kubenswrapper[4546]: I0201 08:25:55.964851 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa"} Feb 01 08:25:55 crc kubenswrapper[4546]: I0201 08:25:55.964895 4546 scope.go:117] "RemoveContainer" containerID="b1d3a0e61ae0a7ece856fee01cbd4b199485d59c3f588a1226bc7c9ef55ff268" Feb 01 08:27:55 crc kubenswrapper[4546]: I0201 08:27:55.420556 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 01 08:27:55 crc kubenswrapper[4546]: I0201 08:27:55.421466 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:28:25 crc kubenswrapper[4546]: I0201 08:28:25.421033 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:28:25 crc kubenswrapper[4546]: I0201 08:28:25.421819 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:28:55 crc kubenswrapper[4546]: I0201 08:28:55.420311 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:28:55 crc kubenswrapper[4546]: I0201 08:28:55.420851 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:28:55 crc kubenswrapper[4546]: I0201 08:28:55.420912 4546 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 08:28:55 crc kubenswrapper[4546]: I0201 08:28:55.421692 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:28:55 crc kubenswrapper[4546]: I0201 08:28:55.421736 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" gracePeriod=600 Feb 01 08:28:55 crc kubenswrapper[4546]: E0201 08:28:55.555621 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:28:55 crc kubenswrapper[4546]: I0201 08:28:55.564928 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" exitCode=0 Feb 01 08:28:55 crc kubenswrapper[4546]: I0201 08:28:55.565113 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa"} Feb 01 08:28:55 crc kubenswrapper[4546]: I0201 08:28:55.565214 4546 scope.go:117] "RemoveContainer" containerID="5ea9a27553efaf37b6aa9196b22324a6045c968a3acccd76ff7ef21bbcdf4bc6" Feb 01 08:28:55 crc kubenswrapper[4546]: I0201 08:28:55.566059 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:28:55 crc kubenswrapper[4546]: E0201 08:28:55.566711 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:29:06 crc kubenswrapper[4546]: I0201 08:29:06.654435 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:29:06 crc kubenswrapper[4546]: E0201 08:29:06.655247 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:29:18 crc kubenswrapper[4546]: I0201 08:29:18.655790 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:29:18 crc kubenswrapper[4546]: E0201 08:29:18.657146 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:29:32 crc kubenswrapper[4546]: I0201 08:29:32.655811 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:29:32 crc kubenswrapper[4546]: E0201 08:29:32.657187 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:29:43 crc kubenswrapper[4546]: I0201 08:29:43.655236 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:29:43 crc kubenswrapper[4546]: E0201 08:29:43.656731 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:29:57 crc kubenswrapper[4546]: I0201 08:29:57.655170 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:29:57 crc kubenswrapper[4546]: E0201 08:29:57.656116 4546 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.168600 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c"] Feb 01 08:30:00 crc kubenswrapper[4546]: E0201 08:30:00.169952 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="registry-server" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.169969 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="registry-server" Feb 01 08:30:00 crc kubenswrapper[4546]: E0201 08:30:00.169979 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="extract-utilities" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.169985 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="extract-utilities" Feb 01 08:30:00 crc kubenswrapper[4546]: E0201 08:30:00.170015 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="extract-content" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.170022 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="extract-content" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.170226 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="20d6d73d-fa17-4cc8-af70-23c61e9ae286" containerName="registry-server" Feb 01 08:30:00 crc 
kubenswrapper[4546]: I0201 08:30:00.172674 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.177596 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c"] Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.186568 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.189036 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.250475 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d132734a-af40-40be-b663-d1cb789b819c-config-volume\") pod \"collect-profiles-29498910-k9v6c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.250559 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d132734a-af40-40be-b663-d1cb789b819c-secret-volume\") pod \"collect-profiles-29498910-k9v6c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.250652 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkswz\" (UniqueName: \"kubernetes.io/projected/d132734a-af40-40be-b663-d1cb789b819c-kube-api-access-zkswz\") pod \"collect-profiles-29498910-k9v6c\" 
(UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.353536 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkswz\" (UniqueName: \"kubernetes.io/projected/d132734a-af40-40be-b663-d1cb789b819c-kube-api-access-zkswz\") pod \"collect-profiles-29498910-k9v6c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.353900 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d132734a-af40-40be-b663-d1cb789b819c-config-volume\") pod \"collect-profiles-29498910-k9v6c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.354020 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d132734a-af40-40be-b663-d1cb789b819c-secret-volume\") pod \"collect-profiles-29498910-k9v6c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.354982 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d132734a-af40-40be-b663-d1cb789b819c-config-volume\") pod \"collect-profiles-29498910-k9v6c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.365618 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d132734a-af40-40be-b663-d1cb789b819c-secret-volume\") pod \"collect-profiles-29498910-k9v6c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.377634 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkswz\" (UniqueName: \"kubernetes.io/projected/d132734a-af40-40be-b663-d1cb789b819c-kube-api-access-zkswz\") pod \"collect-profiles-29498910-k9v6c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.498129 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:00 crc kubenswrapper[4546]: I0201 08:30:00.961814 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c"] Feb 01 08:30:01 crc kubenswrapper[4546]: I0201 08:30:01.123387 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" event={"ID":"d132734a-af40-40be-b663-d1cb789b819c","Type":"ContainerStarted","Data":"afb1a3e2af0e55b837d10d233198ded7598008a6b524bac6019c6b1aaeeef867"} Feb 01 08:30:01 crc kubenswrapper[4546]: I0201 08:30:01.149462 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" podStartSLOduration=1.149432796 podStartE2EDuration="1.149432796s" podCreationTimestamp="2026-02-01 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:30:01.138465589 +0000 UTC m=+6431.789401605" watchObservedRunningTime="2026-02-01 08:30:01.149432796 +0000 
UTC m=+6431.800368812" Feb 01 08:30:02 crc kubenswrapper[4546]: I0201 08:30:02.132104 4546 generic.go:334] "Generic (PLEG): container finished" podID="d132734a-af40-40be-b663-d1cb789b819c" containerID="b21636b299343c2b9a5d47f63800c1f3274059713b01f9c9090c06903a6e7334" exitCode=0 Feb 01 08:30:02 crc kubenswrapper[4546]: I0201 08:30:02.132180 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" event={"ID":"d132734a-af40-40be-b663-d1cb789b819c","Type":"ContainerDied","Data":"b21636b299343c2b9a5d47f63800c1f3274059713b01f9c9090c06903a6e7334"} Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.491788 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.645848 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d132734a-af40-40be-b663-d1cb789b819c-config-volume\") pod \"d132734a-af40-40be-b663-d1cb789b819c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.646410 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d132734a-af40-40be-b663-d1cb789b819c-secret-volume\") pod \"d132734a-af40-40be-b663-d1cb789b819c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.646464 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkswz\" (UniqueName: \"kubernetes.io/projected/d132734a-af40-40be-b663-d1cb789b819c-kube-api-access-zkswz\") pod \"d132734a-af40-40be-b663-d1cb789b819c\" (UID: \"d132734a-af40-40be-b663-d1cb789b819c\") " Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.646606 4546 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d132734a-af40-40be-b663-d1cb789b819c-config-volume" (OuterVolumeSpecName: "config-volume") pod "d132734a-af40-40be-b663-d1cb789b819c" (UID: "d132734a-af40-40be-b663-d1cb789b819c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.647271 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d132734a-af40-40be-b663-d1cb789b819c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.653217 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d132734a-af40-40be-b663-d1cb789b819c-kube-api-access-zkswz" (OuterVolumeSpecName: "kube-api-access-zkswz") pod "d132734a-af40-40be-b663-d1cb789b819c" (UID: "d132734a-af40-40be-b663-d1cb789b819c"). InnerVolumeSpecName "kube-api-access-zkswz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.653372 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d132734a-af40-40be-b663-d1cb789b819c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d132734a-af40-40be-b663-d1cb789b819c" (UID: "d132734a-af40-40be-b663-d1cb789b819c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.751362 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d132734a-af40-40be-b663-d1cb789b819c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:30:03 crc kubenswrapper[4546]: I0201 08:30:03.751392 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkswz\" (UniqueName: \"kubernetes.io/projected/d132734a-af40-40be-b663-d1cb789b819c-kube-api-access-zkswz\") on node \"crc\" DevicePath \"\"" Feb 01 08:30:04 crc kubenswrapper[4546]: I0201 08:30:04.155273 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" event={"ID":"d132734a-af40-40be-b663-d1cb789b819c","Type":"ContainerDied","Data":"afb1a3e2af0e55b837d10d233198ded7598008a6b524bac6019c6b1aaeeef867"} Feb 01 08:30:04 crc kubenswrapper[4546]: I0201 08:30:04.155607 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-k9v6c" Feb 01 08:30:04 crc kubenswrapper[4546]: I0201 08:30:04.155337 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afb1a3e2af0e55b837d10d233198ded7598008a6b524bac6019c6b1aaeeef867" Feb 01 08:30:04 crc kubenswrapper[4546]: I0201 08:30:04.572998 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8"] Feb 01 08:30:04 crc kubenswrapper[4546]: I0201 08:30:04.579994 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-2pjh8"] Feb 01 08:30:05 crc kubenswrapper[4546]: I0201 08:30:05.667262 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe2eed6-dd1b-4865-8ee7-1c675edda8c8" path="/var/lib/kubelet/pods/abe2eed6-dd1b-4865-8ee7-1c675edda8c8/volumes" Feb 01 08:30:08 crc kubenswrapper[4546]: I0201 08:30:08.655645 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:30:08 crc kubenswrapper[4546]: E0201 08:30:08.656571 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:30:22 crc kubenswrapper[4546]: I0201 08:30:22.655282 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:30:22 crc kubenswrapper[4546]: E0201 08:30:22.656268 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:30:33 crc kubenswrapper[4546]: I0201 08:30:33.654805 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:30:33 crc kubenswrapper[4546]: E0201 08:30:33.655652 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:30:48 crc kubenswrapper[4546]: I0201 08:30:48.654819 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:30:48 crc kubenswrapper[4546]: E0201 08:30:48.655746 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:30:59 crc kubenswrapper[4546]: I0201 08:30:59.659480 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:30:59 crc kubenswrapper[4546]: I0201 08:30:59.660423 4546 scope.go:117] "RemoveContainer" 
containerID="483b4285388fa5265cef1048c28a1c54297a500a6341411692820a378c923b92" Feb 01 08:30:59 crc kubenswrapper[4546]: E0201 08:30:59.660652 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:31:11 crc kubenswrapper[4546]: I0201 08:31:11.658577 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:31:11 crc kubenswrapper[4546]: E0201 08:31:11.659921 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:31:23 crc kubenswrapper[4546]: I0201 08:31:23.655473 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:31:23 crc kubenswrapper[4546]: E0201 08:31:23.656557 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.448305 4546 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h558w"] Feb 01 08:31:32 crc kubenswrapper[4546]: E0201 08:31:32.450307 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d132734a-af40-40be-b663-d1cb789b819c" containerName="collect-profiles" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.450480 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="d132734a-af40-40be-b663-d1cb789b819c" containerName="collect-profiles" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.450839 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="d132734a-af40-40be-b663-d1cb789b819c" containerName="collect-profiles" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.452327 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.458291 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h558w"] Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.543760 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-catalog-content\") pod \"community-operators-h558w\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.544546 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-utilities\") pod \"community-operators-h558w\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.544685 4546 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p829\" (UniqueName: \"kubernetes.io/projected/4abdc78b-2eaa-4280-997e-cbc6f7081c11-kube-api-access-6p829\") pod \"community-operators-h558w\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.647381 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-utilities\") pod \"community-operators-h558w\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.647465 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p829\" (UniqueName: \"kubernetes.io/projected/4abdc78b-2eaa-4280-997e-cbc6f7081c11-kube-api-access-6p829\") pod \"community-operators-h558w\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.647566 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-catalog-content\") pod \"community-operators-h558w\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.647989 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-catalog-content\") pod \"community-operators-h558w\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.648018 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-utilities\") pod \"community-operators-h558w\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.674772 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p829\" (UniqueName: \"kubernetes.io/projected/4abdc78b-2eaa-4280-997e-cbc6f7081c11-kube-api-access-6p829\") pod \"community-operators-h558w\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:32 crc kubenswrapper[4546]: I0201 08:31:32.789566 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:33 crc kubenswrapper[4546]: I0201 08:31:33.386543 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h558w"] Feb 01 08:31:33 crc kubenswrapper[4546]: I0201 08:31:33.926398 4546 generic.go:334] "Generic (PLEG): container finished" podID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerID="d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085" exitCode=0 Feb 01 08:31:33 crc kubenswrapper[4546]: I0201 08:31:33.926466 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h558w" event={"ID":"4abdc78b-2eaa-4280-997e-cbc6f7081c11","Type":"ContainerDied","Data":"d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085"} Feb 01 08:31:33 crc kubenswrapper[4546]: I0201 08:31:33.926554 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h558w" event={"ID":"4abdc78b-2eaa-4280-997e-cbc6f7081c11","Type":"ContainerStarted","Data":"d84ca26eb2dc534d72f3747ceb7b286f633f193dea6c4088d84e1f33f6fac4d6"} Feb 01 08:31:33 crc 
kubenswrapper[4546]: I0201 08:31:33.928142 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:31:34 crc kubenswrapper[4546]: I0201 08:31:34.655868 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:31:34 crc kubenswrapper[4546]: E0201 08:31:34.656542 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:31:34 crc kubenswrapper[4546]: I0201 08:31:34.946911 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h558w" event={"ID":"4abdc78b-2eaa-4280-997e-cbc6f7081c11","Type":"ContainerStarted","Data":"ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747"} Feb 01 08:31:35 crc kubenswrapper[4546]: I0201 08:31:35.959427 4546 generic.go:334] "Generic (PLEG): container finished" podID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerID="ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747" exitCode=0 Feb 01 08:31:35 crc kubenswrapper[4546]: I0201 08:31:35.959476 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h558w" event={"ID":"4abdc78b-2eaa-4280-997e-cbc6f7081c11","Type":"ContainerDied","Data":"ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747"} Feb 01 08:31:36 crc kubenswrapper[4546]: I0201 08:31:36.968367 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h558w" 
event={"ID":"4abdc78b-2eaa-4280-997e-cbc6f7081c11","Type":"ContainerStarted","Data":"88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748"} Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.004399 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h558w" podStartSLOduration=2.489620676 podStartE2EDuration="5.004381339s" podCreationTimestamp="2026-02-01 08:31:32 +0000 UTC" firstStartedPulling="2026-02-01 08:31:33.92789609 +0000 UTC m=+6524.578832107" lastFinishedPulling="2026-02-01 08:31:36.442656753 +0000 UTC m=+6527.093592770" observedRunningTime="2026-02-01 08:31:36.998297388 +0000 UTC m=+6527.649233394" watchObservedRunningTime="2026-02-01 08:31:37.004381339 +0000 UTC m=+6527.655317354" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.628784 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7klj"] Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.632086 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.656761 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-724fz\" (UniqueName: \"kubernetes.io/projected/db1269bd-d5fe-4188-b11b-143455adde95-kube-api-access-724fz\") pod \"certified-operators-p7klj\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.656954 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-utilities\") pod \"certified-operators-p7klj\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.657033 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-catalog-content\") pod \"certified-operators-p7klj\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.669158 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7klj"] Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.758018 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-catalog-content\") pod \"certified-operators-p7klj\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.758182 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-724fz\" (UniqueName: \"kubernetes.io/projected/db1269bd-d5fe-4188-b11b-143455adde95-kube-api-access-724fz\") pod \"certified-operators-p7klj\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.758301 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-utilities\") pod \"certified-operators-p7klj\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.758940 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-utilities\") pod \"certified-operators-p7klj\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.759066 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-catalog-content\") pod \"certified-operators-p7klj\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.777895 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-724fz\" (UniqueName: \"kubernetes.io/projected/db1269bd-d5fe-4188-b11b-143455adde95-kube-api-access-724fz\") pod \"certified-operators-p7klj\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:37 crc kubenswrapper[4546]: I0201 08:31:37.950297 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.553312 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7klj"] Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.651795 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qmwgh"] Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.653846 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.665971 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmwgh"] Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.683051 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dq45\" (UniqueName: \"kubernetes.io/projected/6b7416a3-a306-446d-9ea2-ff2e52461aab-kube-api-access-5dq45\") pod \"redhat-marketplace-qmwgh\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.683130 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-catalog-content\") pod \"redhat-marketplace-qmwgh\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.683250 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-utilities\") pod \"redhat-marketplace-qmwgh\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " 
pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.784499 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-utilities\") pod \"redhat-marketplace-qmwgh\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.785074 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dq45\" (UniqueName: \"kubernetes.io/projected/6b7416a3-a306-446d-9ea2-ff2e52461aab-kube-api-access-5dq45\") pod \"redhat-marketplace-qmwgh\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.785090 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-utilities\") pod \"redhat-marketplace-qmwgh\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.785133 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-catalog-content\") pod \"redhat-marketplace-qmwgh\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.785517 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-catalog-content\") pod \"redhat-marketplace-qmwgh\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " pod="openshift-marketplace/redhat-marketplace-qmwgh" 
Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.806606 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dq45\" (UniqueName: \"kubernetes.io/projected/6b7416a3-a306-446d-9ea2-ff2e52461aab-kube-api-access-5dq45\") pod \"redhat-marketplace-qmwgh\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:38 crc kubenswrapper[4546]: I0201 08:31:38.973584 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:39 crc kubenswrapper[4546]: I0201 08:31:39.023903 4546 generic.go:334] "Generic (PLEG): container finished" podID="db1269bd-d5fe-4188-b11b-143455adde95" containerID="45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e" exitCode=0 Feb 01 08:31:39 crc kubenswrapper[4546]: I0201 08:31:39.023955 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7klj" event={"ID":"db1269bd-d5fe-4188-b11b-143455adde95","Type":"ContainerDied","Data":"45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e"} Feb 01 08:31:39 crc kubenswrapper[4546]: I0201 08:31:39.023996 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7klj" event={"ID":"db1269bd-d5fe-4188-b11b-143455adde95","Type":"ContainerStarted","Data":"3db8bed6b9b75225f341777072d8570cd3215f18974e7775a7186a9b1d740f9d"} Feb 01 08:31:39 crc kubenswrapper[4546]: I0201 08:31:39.511679 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmwgh"] Feb 01 08:31:40 crc kubenswrapper[4546]: I0201 08:31:40.041465 4546 generic.go:334] "Generic (PLEG): container finished" podID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerID="f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8" exitCode=0 Feb 01 08:31:40 crc kubenswrapper[4546]: I0201 08:31:40.041905 4546 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmwgh" event={"ID":"6b7416a3-a306-446d-9ea2-ff2e52461aab","Type":"ContainerDied","Data":"f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8"} Feb 01 08:31:40 crc kubenswrapper[4546]: I0201 08:31:40.041956 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmwgh" event={"ID":"6b7416a3-a306-446d-9ea2-ff2e52461aab","Type":"ContainerStarted","Data":"f2b8a99490b78f14d12571c67cbdc0dd580d75632467a38c53288cd42d212251"} Feb 01 08:31:41 crc kubenswrapper[4546]: I0201 08:31:41.052788 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7klj" event={"ID":"db1269bd-d5fe-4188-b11b-143455adde95","Type":"ContainerStarted","Data":"c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9"} Feb 01 08:31:41 crc kubenswrapper[4546]: I0201 08:31:41.055582 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmwgh" event={"ID":"6b7416a3-a306-446d-9ea2-ff2e52461aab","Type":"ContainerStarted","Data":"99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041"} Feb 01 08:31:42 crc kubenswrapper[4546]: I0201 08:31:42.073107 4546 generic.go:334] "Generic (PLEG): container finished" podID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerID="99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041" exitCode=0 Feb 01 08:31:42 crc kubenswrapper[4546]: I0201 08:31:42.073224 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmwgh" event={"ID":"6b7416a3-a306-446d-9ea2-ff2e52461aab","Type":"ContainerDied","Data":"99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041"} Feb 01 08:31:42 crc kubenswrapper[4546]: I0201 08:31:42.076949 4546 generic.go:334] "Generic (PLEG): container finished" podID="db1269bd-d5fe-4188-b11b-143455adde95" 
containerID="c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9" exitCode=0 Feb 01 08:31:42 crc kubenswrapper[4546]: I0201 08:31:42.077023 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7klj" event={"ID":"db1269bd-d5fe-4188-b11b-143455adde95","Type":"ContainerDied","Data":"c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9"} Feb 01 08:31:42 crc kubenswrapper[4546]: I0201 08:31:42.790463 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:42 crc kubenswrapper[4546]: I0201 08:31:42.790894 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:42 crc kubenswrapper[4546]: I0201 08:31:42.833107 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:43 crc kubenswrapper[4546]: I0201 08:31:43.086252 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmwgh" event={"ID":"6b7416a3-a306-446d-9ea2-ff2e52461aab","Type":"ContainerStarted","Data":"0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c"} Feb 01 08:31:43 crc kubenswrapper[4546]: I0201 08:31:43.088694 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7klj" event={"ID":"db1269bd-d5fe-4188-b11b-143455adde95","Type":"ContainerStarted","Data":"557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825"} Feb 01 08:31:43 crc kubenswrapper[4546]: I0201 08:31:43.111876 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qmwgh" podStartSLOduration=2.607144441 podStartE2EDuration="5.111838965s" podCreationTimestamp="2026-02-01 08:31:38 +0000 UTC" firstStartedPulling="2026-02-01 08:31:40.044569463 +0000 UTC 
m=+6530.695505478" lastFinishedPulling="2026-02-01 08:31:42.549263986 +0000 UTC m=+6533.200200002" observedRunningTime="2026-02-01 08:31:43.107361805 +0000 UTC m=+6533.758297820" watchObservedRunningTime="2026-02-01 08:31:43.111838965 +0000 UTC m=+6533.762774981" Feb 01 08:31:43 crc kubenswrapper[4546]: I0201 08:31:43.148391 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:43 crc kubenswrapper[4546]: I0201 08:31:43.168696 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7klj" podStartSLOduration=2.610538374 podStartE2EDuration="6.168667256s" podCreationTimestamp="2026-02-01 08:31:37 +0000 UTC" firstStartedPulling="2026-02-01 08:31:39.033216533 +0000 UTC m=+6529.684152549" lastFinishedPulling="2026-02-01 08:31:42.591345415 +0000 UTC m=+6533.242281431" observedRunningTime="2026-02-01 08:31:43.148130777 +0000 UTC m=+6533.799066803" watchObservedRunningTime="2026-02-01 08:31:43.168667256 +0000 UTC m=+6533.819603272" Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.422584 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h558w"] Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.423144 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h558w" podUID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerName="registry-server" containerID="cri-o://88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748" gracePeriod=2 Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.858940 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.926328 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-catalog-content\") pod \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.926616 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p829\" (UniqueName: \"kubernetes.io/projected/4abdc78b-2eaa-4280-997e-cbc6f7081c11-kube-api-access-6p829\") pod \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.926757 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-utilities\") pod \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\" (UID: \"4abdc78b-2eaa-4280-997e-cbc6f7081c11\") " Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.929771 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-utilities" (OuterVolumeSpecName: "utilities") pod "4abdc78b-2eaa-4280-997e-cbc6f7081c11" (UID: "4abdc78b-2eaa-4280-997e-cbc6f7081c11"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.931185 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.936926 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abdc78b-2eaa-4280-997e-cbc6f7081c11-kube-api-access-6p829" (OuterVolumeSpecName: "kube-api-access-6p829") pod "4abdc78b-2eaa-4280-997e-cbc6f7081c11" (UID: "4abdc78b-2eaa-4280-997e-cbc6f7081c11"). InnerVolumeSpecName "kube-api-access-6p829". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:31:46 crc kubenswrapper[4546]: I0201 08:31:46.975383 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4abdc78b-2eaa-4280-997e-cbc6f7081c11" (UID: "4abdc78b-2eaa-4280-997e-cbc6f7081c11"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.033659 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abdc78b-2eaa-4280-997e-cbc6f7081c11-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.033694 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p829\" (UniqueName: \"kubernetes.io/projected/4abdc78b-2eaa-4280-997e-cbc6f7081c11-kube-api-access-6p829\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.127185 4546 generic.go:334] "Generic (PLEG): container finished" podID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerID="88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748" exitCode=0 Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.127254 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h558w" event={"ID":"4abdc78b-2eaa-4280-997e-cbc6f7081c11","Type":"ContainerDied","Data":"88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748"} Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.127290 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h558w" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.127307 4546 scope.go:117] "RemoveContainer" containerID="88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.127294 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h558w" event={"ID":"4abdc78b-2eaa-4280-997e-cbc6f7081c11","Type":"ContainerDied","Data":"d84ca26eb2dc534d72f3747ceb7b286f633f193dea6c4088d84e1f33f6fac4d6"} Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.150418 4546 scope.go:117] "RemoveContainer" containerID="ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.163233 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h558w"] Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.173043 4546 scope.go:117] "RemoveContainer" containerID="d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.174415 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h558w"] Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.206495 4546 scope.go:117] "RemoveContainer" containerID="88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748" Feb 01 08:31:47 crc kubenswrapper[4546]: E0201 08:31:47.206922 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748\": container with ID starting with 88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748 not found: ID does not exist" containerID="88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.206953 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748"} err="failed to get container status \"88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748\": rpc error: code = NotFound desc = could not find container \"88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748\": container with ID starting with 88e9efaecc76d70524bd00eec81328be18b3a308539e1c7c750c5b2283efe748 not found: ID does not exist" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.206973 4546 scope.go:117] "RemoveContainer" containerID="ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747" Feb 01 08:31:47 crc kubenswrapper[4546]: E0201 08:31:47.207292 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747\": container with ID starting with ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747 not found: ID does not exist" containerID="ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.207323 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747"} err="failed to get container status \"ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747\": rpc error: code = NotFound desc = could not find container \"ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747\": container with ID starting with ce18465b494d5b2a57ea5ffc18a5299644e98e5075fb7bfef483c04f9240d747 not found: ID does not exist" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.207337 4546 scope.go:117] "RemoveContainer" containerID="d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085" Feb 01 08:31:47 crc kubenswrapper[4546]: E0201 
08:31:47.207982 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085\": container with ID starting with d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085 not found: ID does not exist" containerID="d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.208126 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085"} err="failed to get container status \"d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085\": rpc error: code = NotFound desc = could not find container \"d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085\": container with ID starting with d325cca20a6402b1c6826756c94b990fa3433e9e9e1e19e6a22412692f17d085 not found: ID does not exist" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.655918 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:31:47 crc kubenswrapper[4546]: E0201 08:31:47.656291 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.664982 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" path="/var/lib/kubelet/pods/4abdc78b-2eaa-4280-997e-cbc6f7081c11/volumes" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.951363 
4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.951421 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:47 crc kubenswrapper[4546]: I0201 08:31:47.994340 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:48 crc kubenswrapper[4546]: I0201 08:31:48.188235 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:48 crc kubenswrapper[4546]: I0201 08:31:48.821408 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7klj"] Feb 01 08:31:48 crc kubenswrapper[4546]: I0201 08:31:48.974123 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:48 crc kubenswrapper[4546]: I0201 08:31:48.974430 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:49 crc kubenswrapper[4546]: I0201 08:31:49.008892 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:49 crc kubenswrapper[4546]: I0201 08:31:49.193038 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.164439 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p7klj" podUID="db1269bd-d5fe-4188-b11b-143455adde95" containerName="registry-server" containerID="cri-o://557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825" gracePeriod=2 
Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.560362 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.616618 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-utilities\") pod \"db1269bd-d5fe-4188-b11b-143455adde95\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.616717 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-catalog-content\") pod \"db1269bd-d5fe-4188-b11b-143455adde95\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.616948 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-724fz\" (UniqueName: \"kubernetes.io/projected/db1269bd-d5fe-4188-b11b-143455adde95-kube-api-access-724fz\") pod \"db1269bd-d5fe-4188-b11b-143455adde95\" (UID: \"db1269bd-d5fe-4188-b11b-143455adde95\") " Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.617362 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-utilities" (OuterVolumeSpecName: "utilities") pod "db1269bd-d5fe-4188-b11b-143455adde95" (UID: "db1269bd-d5fe-4188-b11b-143455adde95"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.618640 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.623654 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1269bd-d5fe-4188-b11b-143455adde95-kube-api-access-724fz" (OuterVolumeSpecName: "kube-api-access-724fz") pod "db1269bd-d5fe-4188-b11b-143455adde95" (UID: "db1269bd-d5fe-4188-b11b-143455adde95"). InnerVolumeSpecName "kube-api-access-724fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.660045 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db1269bd-d5fe-4188-b11b-143455adde95" (UID: "db1269bd-d5fe-4188-b11b-143455adde95"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.723520 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-724fz\" (UniqueName: \"kubernetes.io/projected/db1269bd-d5fe-4188-b11b-143455adde95-kube-api-access-724fz\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:50 crc kubenswrapper[4546]: I0201 08:31:50.723779 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db1269bd-d5fe-4188-b11b-143455adde95-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.174831 4546 generic.go:334] "Generic (PLEG): container finished" podID="db1269bd-d5fe-4188-b11b-143455adde95" containerID="557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825" exitCode=0 Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.174916 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7klj" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.174927 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7klj" event={"ID":"db1269bd-d5fe-4188-b11b-143455adde95","Type":"ContainerDied","Data":"557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825"} Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.174992 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7klj" event={"ID":"db1269bd-d5fe-4188-b11b-143455adde95","Type":"ContainerDied","Data":"3db8bed6b9b75225f341777072d8570cd3215f18974e7775a7186a9b1d740f9d"} Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.175013 4546 scope.go:117] "RemoveContainer" containerID="557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.195686 4546 scope.go:117] "RemoveContainer" 
containerID="c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.209524 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7klj"] Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.218723 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p7klj"] Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.222953 4546 scope.go:117] "RemoveContainer" containerID="45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.252042 4546 scope.go:117] "RemoveContainer" containerID="557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825" Feb 01 08:31:51 crc kubenswrapper[4546]: E0201 08:31:51.252515 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825\": container with ID starting with 557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825 not found: ID does not exist" containerID="557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.252554 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825"} err="failed to get container status \"557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825\": rpc error: code = NotFound desc = could not find container \"557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825\": container with ID starting with 557712ad3aa982c7d08e19f4d7cb8aa83bc3b6907ba46a5261377412ed093825 not found: ID does not exist" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.252577 4546 scope.go:117] "RemoveContainer" 
containerID="c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9" Feb 01 08:31:51 crc kubenswrapper[4546]: E0201 08:31:51.252841 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9\": container with ID starting with c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9 not found: ID does not exist" containerID="c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.252883 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9"} err="failed to get container status \"c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9\": rpc error: code = NotFound desc = could not find container \"c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9\": container with ID starting with c622d8854a52c1c7d9676980efbb837bab93f1de30d983381c7673f55fefd6f9 not found: ID does not exist" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.252897 4546 scope.go:117] "RemoveContainer" containerID="45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e" Feb 01 08:31:51 crc kubenswrapper[4546]: E0201 08:31:51.253121 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e\": container with ID starting with 45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e not found: ID does not exist" containerID="45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.253144 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e"} err="failed to get container status \"45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e\": rpc error: code = NotFound desc = could not find container \"45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e\": container with ID starting with 45ce1b2e3f0b5ecfda5f811d8e446b321fa9da8e76c09e55ee70b6253664600e not found: ID does not exist" Feb 01 08:31:51 crc kubenswrapper[4546]: I0201 08:31:51.662686 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1269bd-d5fe-4188-b11b-143455adde95" path="/var/lib/kubelet/pods/db1269bd-d5fe-4188-b11b-143455adde95/volumes" Feb 01 08:31:55 crc kubenswrapper[4546]: I0201 08:31:55.822103 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmwgh"] Feb 01 08:31:55 crc kubenswrapper[4546]: I0201 08:31:55.822911 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qmwgh" podUID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerName="registry-server" containerID="cri-o://0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c" gracePeriod=2 Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.232972 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.240120 4546 generic.go:334] "Generic (PLEG): container finished" podID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerID="0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c" exitCode=0 Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.240158 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmwgh" event={"ID":"6b7416a3-a306-446d-9ea2-ff2e52461aab","Type":"ContainerDied","Data":"0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c"} Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.240185 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmwgh" event={"ID":"6b7416a3-a306-446d-9ea2-ff2e52461aab","Type":"ContainerDied","Data":"f2b8a99490b78f14d12571c67cbdc0dd580d75632467a38c53288cd42d212251"} Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.240203 4546 scope.go:117] "RemoveContainer" containerID="0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.289134 4546 scope.go:117] "RemoveContainer" containerID="99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.306468 4546 scope.go:117] "RemoveContainer" containerID="f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.337566 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-utilities\") pod \"6b7416a3-a306-446d-9ea2-ff2e52461aab\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.337716 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-5dq45\" (UniqueName: \"kubernetes.io/projected/6b7416a3-a306-446d-9ea2-ff2e52461aab-kube-api-access-5dq45\") pod \"6b7416a3-a306-446d-9ea2-ff2e52461aab\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.337842 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-catalog-content\") pod \"6b7416a3-a306-446d-9ea2-ff2e52461aab\" (UID: \"6b7416a3-a306-446d-9ea2-ff2e52461aab\") " Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.338406 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-utilities" (OuterVolumeSpecName: "utilities") pod "6b7416a3-a306-446d-9ea2-ff2e52461aab" (UID: "6b7416a3-a306-446d-9ea2-ff2e52461aab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.339074 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.344520 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b7416a3-a306-446d-9ea2-ff2e52461aab-kube-api-access-5dq45" (OuterVolumeSpecName: "kube-api-access-5dq45") pod "6b7416a3-a306-446d-9ea2-ff2e52461aab" (UID: "6b7416a3-a306-446d-9ea2-ff2e52461aab"). InnerVolumeSpecName "kube-api-access-5dq45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.345300 4546 scope.go:117] "RemoveContainer" containerID="0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c" Feb 01 08:31:56 crc kubenswrapper[4546]: E0201 08:31:56.345705 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c\": container with ID starting with 0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c not found: ID does not exist" containerID="0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.345808 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c"} err="failed to get container status \"0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c\": rpc error: code = NotFound desc = could not find container \"0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c\": container with ID starting with 0a21b767a4e33eb01be693546202ba2d26ff6a90a21ceb1f19994e06e0de2e7c not found: ID does not exist" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.345970 4546 scope.go:117] "RemoveContainer" containerID="99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041" Feb 01 08:31:56 crc kubenswrapper[4546]: E0201 08:31:56.346308 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041\": container with ID starting with 99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041 not found: ID does not exist" containerID="99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.346412 
4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041"} err="failed to get container status \"99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041\": rpc error: code = NotFound desc = could not find container \"99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041\": container with ID starting with 99e944d82a9516bc5878867436f92b7ed5fa19d2c11e0bcbe13973bbb9903041 not found: ID does not exist" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.346491 4546 scope.go:117] "RemoveContainer" containerID="f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8" Feb 01 08:31:56 crc kubenswrapper[4546]: E0201 08:31:56.347270 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8\": container with ID starting with f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8 not found: ID does not exist" containerID="f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.347356 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8"} err="failed to get container status \"f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8\": rpc error: code = NotFound desc = could not find container \"f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8\": container with ID starting with f8fa3ffd5f859700cb90a5ffac9b300471011221ca1dda34fdf656fe02469ff8 not found: ID does not exist" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.358729 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "6b7416a3-a306-446d-9ea2-ff2e52461aab" (UID: "6b7416a3-a306-446d-9ea2-ff2e52461aab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.441174 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b7416a3-a306-446d-9ea2-ff2e52461aab-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:56 crc kubenswrapper[4546]: I0201 08:31:56.441277 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dq45\" (UniqueName: \"kubernetes.io/projected/6b7416a3-a306-446d-9ea2-ff2e52461aab-kube-api-access-5dq45\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:57 crc kubenswrapper[4546]: I0201 08:31:57.250808 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmwgh" Feb 01 08:31:57 crc kubenswrapper[4546]: I0201 08:31:57.286658 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmwgh"] Feb 01 08:31:57 crc kubenswrapper[4546]: I0201 08:31:57.294696 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmwgh"] Feb 01 08:31:57 crc kubenswrapper[4546]: I0201 08:31:57.664570 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b7416a3-a306-446d-9ea2-ff2e52461aab" path="/var/lib/kubelet/pods/6b7416a3-a306-446d-9ea2-ff2e52461aab/volumes" Feb 01 08:32:02 crc kubenswrapper[4546]: I0201 08:32:02.654889 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:32:02 crc kubenswrapper[4546]: E0201 08:32:02.655886 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:32:13 crc kubenswrapper[4546]: I0201 08:32:13.655761 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:32:13 crc kubenswrapper[4546]: E0201 08:32:13.656577 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:32:24 crc kubenswrapper[4546]: I0201 08:32:24.655881 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:32:24 crc kubenswrapper[4546]: E0201 08:32:24.656772 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:32:39 crc kubenswrapper[4546]: I0201 08:32:39.660941 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:32:39 crc kubenswrapper[4546]: E0201 08:32:39.661830 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:32:53 crc kubenswrapper[4546]: I0201 08:32:53.655461 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:32:53 crc kubenswrapper[4546]: E0201 08:32:53.656510 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:33:05 crc kubenswrapper[4546]: I0201 08:33:05.655803 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:33:05 crc kubenswrapper[4546]: E0201 08:33:05.656894 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:33:17 crc kubenswrapper[4546]: I0201 08:33:17.654711 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:33:17 crc kubenswrapper[4546]: E0201 08:33:17.655522 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:33:29 crc kubenswrapper[4546]: I0201 08:33:29.660379 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:33:29 crc kubenswrapper[4546]: E0201 08:33:29.661424 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:33:44 crc kubenswrapper[4546]: I0201 08:33:44.655293 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:33:44 crc kubenswrapper[4546]: E0201 08:33:44.656641 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:33:59 crc kubenswrapper[4546]: I0201 08:33:59.659694 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:34:00 crc kubenswrapper[4546]: I0201 08:34:00.279123 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"2e0e99976bb04e7150ed3e35dc3b13b29f7813e94e517db90cbadca43f6f05f6"} Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.828558 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fp5vd"] Feb 01 08:35:17 crc kubenswrapper[4546]: E0201 08:35:17.829632 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerName="extract-utilities" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.829648 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerName="extract-utilities" Feb 01 08:35:17 crc kubenswrapper[4546]: E0201 08:35:17.829659 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerName="extract-content" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.829666 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerName="extract-content" Feb 01 08:35:17 crc kubenswrapper[4546]: E0201 08:35:17.829679 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerName="registry-server" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.829686 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerName="registry-server" Feb 01 08:35:17 crc kubenswrapper[4546]: E0201 08:35:17.829695 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1269bd-d5fe-4188-b11b-143455adde95" containerName="extract-content" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.829702 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1269bd-d5fe-4188-b11b-143455adde95" containerName="extract-content" Feb 01 08:35:17 crc kubenswrapper[4546]: E0201 08:35:17.829711 4546 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="db1269bd-d5fe-4188-b11b-143455adde95" containerName="registry-server" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.829716 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1269bd-d5fe-4188-b11b-143455adde95" containerName="registry-server" Feb 01 08:35:17 crc kubenswrapper[4546]: E0201 08:35:17.829731 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerName="extract-content" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.829736 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerName="extract-content" Feb 01 08:35:17 crc kubenswrapper[4546]: E0201 08:35:17.829758 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerName="registry-server" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.829763 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerName="registry-server" Feb 01 08:35:17 crc kubenswrapper[4546]: E0201 08:35:17.829775 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1269bd-d5fe-4188-b11b-143455adde95" containerName="extract-utilities" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.829781 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1269bd-d5fe-4188-b11b-143455adde95" containerName="extract-utilities" Feb 01 08:35:17 crc kubenswrapper[4546]: E0201 08:35:17.829798 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerName="extract-utilities" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.829804 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerName="extract-utilities" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.830053 4546 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="db1269bd-d5fe-4188-b11b-143455adde95" containerName="registry-server" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.830067 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7416a3-a306-446d-9ea2-ff2e52461aab" containerName="registry-server" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.830086 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abdc78b-2eaa-4280-997e-cbc6f7081c11" containerName="registry-server" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.831779 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.857598 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp5vd"] Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.938771 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-catalog-content\") pod \"redhat-operators-fp5vd\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.938827 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6jxf\" (UniqueName: \"kubernetes.io/projected/397b829d-4a39-4387-971f-13c59e047664-kube-api-access-s6jxf\") pod \"redhat-operators-fp5vd\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:17 crc kubenswrapper[4546]: I0201 08:35:17.939007 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-utilities\") pod 
\"redhat-operators-fp5vd\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.041460 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-utilities\") pod \"redhat-operators-fp5vd\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.041720 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-catalog-content\") pod \"redhat-operators-fp5vd\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.041762 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6jxf\" (UniqueName: \"kubernetes.io/projected/397b829d-4a39-4387-971f-13c59e047664-kube-api-access-s6jxf\") pod \"redhat-operators-fp5vd\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.042468 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-utilities\") pod \"redhat-operators-fp5vd\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.042778 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-catalog-content\") pod \"redhat-operators-fp5vd\" (UID: 
\"397b829d-4a39-4387-971f-13c59e047664\") " pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.060670 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6jxf\" (UniqueName: \"kubernetes.io/projected/397b829d-4a39-4387-971f-13c59e047664-kube-api-access-s6jxf\") pod \"redhat-operators-fp5vd\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.149953 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.460115 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp5vd"] Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.931493 4546 generic.go:334] "Generic (PLEG): container finished" podID="397b829d-4a39-4387-971f-13c59e047664" containerID="1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e" exitCode=0 Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.931647 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5vd" event={"ID":"397b829d-4a39-4387-971f-13c59e047664","Type":"ContainerDied","Data":"1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e"} Feb 01 08:35:18 crc kubenswrapper[4546]: I0201 08:35:18.931874 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5vd" event={"ID":"397b829d-4a39-4387-971f-13c59e047664","Type":"ContainerStarted","Data":"3fcfe29f82b5d64029e62001dbe6fb432b68b67b37fb8548df5988f7075c1904"} Feb 01 08:35:19 crc kubenswrapper[4546]: I0201 08:35:19.946367 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5vd" 
event={"ID":"397b829d-4a39-4387-971f-13c59e047664","Type":"ContainerStarted","Data":"91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23"} Feb 01 08:35:22 crc kubenswrapper[4546]: I0201 08:35:22.974916 4546 generic.go:334] "Generic (PLEG): container finished" podID="397b829d-4a39-4387-971f-13c59e047664" containerID="91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23" exitCode=0 Feb 01 08:35:22 crc kubenswrapper[4546]: I0201 08:35:22.974963 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5vd" event={"ID":"397b829d-4a39-4387-971f-13c59e047664","Type":"ContainerDied","Data":"91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23"} Feb 01 08:35:23 crc kubenswrapper[4546]: I0201 08:35:23.989386 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5vd" event={"ID":"397b829d-4a39-4387-971f-13c59e047664","Type":"ContainerStarted","Data":"eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8"} Feb 01 08:35:24 crc kubenswrapper[4546]: I0201 08:35:24.009548 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fp5vd" podStartSLOduration=2.360765906 podStartE2EDuration="7.009528208s" podCreationTimestamp="2026-02-01 08:35:17 +0000 UTC" firstStartedPulling="2026-02-01 08:35:18.934598185 +0000 UTC m=+6749.585534200" lastFinishedPulling="2026-02-01 08:35:23.583360486 +0000 UTC m=+6754.234296502" observedRunningTime="2026-02-01 08:35:24.009200179 +0000 UTC m=+6754.660136196" watchObservedRunningTime="2026-02-01 08:35:24.009528208 +0000 UTC m=+6754.660464224" Feb 01 08:35:28 crc kubenswrapper[4546]: I0201 08:35:28.150963 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:28 crc kubenswrapper[4546]: I0201 08:35:28.151459 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:29 crc kubenswrapper[4546]: I0201 08:35:29.190925 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fp5vd" podUID="397b829d-4a39-4387-971f-13c59e047664" containerName="registry-server" probeResult="failure" output=< Feb 01 08:35:29 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 08:35:29 crc kubenswrapper[4546]: > Feb 01 08:35:38 crc kubenswrapper[4546]: I0201 08:35:38.197424 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:38 crc kubenswrapper[4546]: I0201 08:35:38.242838 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:38 crc kubenswrapper[4546]: I0201 08:35:38.444034 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp5vd"] Feb 01 08:35:40 crc kubenswrapper[4546]: I0201 08:35:40.131222 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fp5vd" podUID="397b829d-4a39-4387-971f-13c59e047664" containerName="registry-server" containerID="cri-o://eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8" gracePeriod=2 Feb 01 08:35:40 crc kubenswrapper[4546]: I0201 08:35:40.755132 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:40 crc kubenswrapper[4546]: I0201 08:35:40.926242 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-utilities\") pod \"397b829d-4a39-4387-971f-13c59e047664\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " Feb 01 08:35:40 crc kubenswrapper[4546]: I0201 08:35:40.926714 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6jxf\" (UniqueName: \"kubernetes.io/projected/397b829d-4a39-4387-971f-13c59e047664-kube-api-access-s6jxf\") pod \"397b829d-4a39-4387-971f-13c59e047664\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " Feb 01 08:35:40 crc kubenswrapper[4546]: I0201 08:35:40.926990 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-catalog-content\") pod \"397b829d-4a39-4387-971f-13c59e047664\" (UID: \"397b829d-4a39-4387-971f-13c59e047664\") " Feb 01 08:35:40 crc kubenswrapper[4546]: I0201 08:35:40.927576 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-utilities" (OuterVolumeSpecName: "utilities") pod "397b829d-4a39-4387-971f-13c59e047664" (UID: "397b829d-4a39-4387-971f-13c59e047664"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:35:40 crc kubenswrapper[4546]: I0201 08:35:40.938646 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397b829d-4a39-4387-971f-13c59e047664-kube-api-access-s6jxf" (OuterVolumeSpecName: "kube-api-access-s6jxf") pod "397b829d-4a39-4387-971f-13c59e047664" (UID: "397b829d-4a39-4387-971f-13c59e047664"). InnerVolumeSpecName "kube-api-access-s6jxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.004191 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "397b829d-4a39-4387-971f-13c59e047664" (UID: "397b829d-4a39-4387-971f-13c59e047664"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.030701 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.030909 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6jxf\" (UniqueName: \"kubernetes.io/projected/397b829d-4a39-4387-971f-13c59e047664-kube-api-access-s6jxf\") on node \"crc\" DevicePath \"\"" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.030982 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397b829d-4a39-4387-971f-13c59e047664-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.144936 4546 generic.go:334] "Generic (PLEG): container finished" podID="397b829d-4a39-4387-971f-13c59e047664" containerID="eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8" exitCode=0 Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.144983 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5vd" event={"ID":"397b829d-4a39-4387-971f-13c59e047664","Type":"ContainerDied","Data":"eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8"} Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.145000 4546 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp5vd" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.145018 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp5vd" event={"ID":"397b829d-4a39-4387-971f-13c59e047664","Type":"ContainerDied","Data":"3fcfe29f82b5d64029e62001dbe6fb432b68b67b37fb8548df5988f7075c1904"} Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.145039 4546 scope.go:117] "RemoveContainer" containerID="eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.183050 4546 scope.go:117] "RemoveContainer" containerID="91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.208184 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp5vd"] Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.210588 4546 scope.go:117] "RemoveContainer" containerID="1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.218299 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fp5vd"] Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.250394 4546 scope.go:117] "RemoveContainer" containerID="eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8" Feb 01 08:35:41 crc kubenswrapper[4546]: E0201 08:35:41.250783 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8\": container with ID starting with eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8 not found: ID does not exist" containerID="eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.250819 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8"} err="failed to get container status \"eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8\": rpc error: code = NotFound desc = could not find container \"eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8\": container with ID starting with eb2258dd7362163a1a353b1abe7fba23e6721e03d39ce6a97ea38857af321eb8 not found: ID does not exist" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.250845 4546 scope.go:117] "RemoveContainer" containerID="91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23" Feb 01 08:35:41 crc kubenswrapper[4546]: E0201 08:35:41.251095 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23\": container with ID starting with 91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23 not found: ID does not exist" containerID="91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.251111 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23"} err="failed to get container status \"91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23\": rpc error: code = NotFound desc = could not find container \"91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23\": container with ID starting with 91d278c3d5adbc9e7591fac3813115a005aeb95bfe4c7cc7769b7a600f67cb23 not found: ID does not exist" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.251122 4546 scope.go:117] "RemoveContainer" containerID="1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e" Feb 01 08:35:41 crc kubenswrapper[4546]: E0201 
08:35:41.251492 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e\": container with ID starting with 1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e not found: ID does not exist" containerID="1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.251507 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e"} err="failed to get container status \"1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e\": rpc error: code = NotFound desc = could not find container \"1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e\": container with ID starting with 1093a736e65bf8f51553ba7fe6246918765fed4d8cbc01d55a38a01c4e40836e not found: ID does not exist" Feb 01 08:35:41 crc kubenswrapper[4546]: I0201 08:35:41.666032 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="397b829d-4a39-4387-971f-13c59e047664" path="/var/lib/kubelet/pods/397b829d-4a39-4387-971f-13c59e047664/volumes" Feb 01 08:36:25 crc kubenswrapper[4546]: I0201 08:36:25.421314 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:36:25 crc kubenswrapper[4546]: I0201 08:36:25.422301 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 01 08:36:55 crc kubenswrapper[4546]: I0201 08:36:55.421077 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:36:55 crc kubenswrapper[4546]: I0201 08:36:55.421480 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:36:59 crc kubenswrapper[4546]: E0201 08:36:59.238650 4546 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.26.196:51380->192.168.26.196:40843: read tcp 192.168.26.196:51380->192.168.26.196:40843: read: connection reset by peer Feb 01 08:37:25 crc kubenswrapper[4546]: I0201 08:37:25.420713 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:37:25 crc kubenswrapper[4546]: I0201 08:37:25.421380 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:37:25 crc kubenswrapper[4546]: I0201 08:37:25.421442 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 08:37:25 
crc kubenswrapper[4546]: I0201 08:37:25.422805 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e0e99976bb04e7150ed3e35dc3b13b29f7813e94e517db90cbadca43f6f05f6"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:37:25 crc kubenswrapper[4546]: I0201 08:37:25.422910 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://2e0e99976bb04e7150ed3e35dc3b13b29f7813e94e517db90cbadca43f6f05f6" gracePeriod=600 Feb 01 08:37:26 crc kubenswrapper[4546]: I0201 08:37:26.176535 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="2e0e99976bb04e7150ed3e35dc3b13b29f7813e94e517db90cbadca43f6f05f6" exitCode=0 Feb 01 08:37:26 crc kubenswrapper[4546]: I0201 08:37:26.176599 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"2e0e99976bb04e7150ed3e35dc3b13b29f7813e94e517db90cbadca43f6f05f6"} Feb 01 08:37:26 crc kubenswrapper[4546]: I0201 08:37:26.177027 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676"} Feb 01 08:37:26 crc kubenswrapper[4546]: I0201 08:37:26.177048 4546 scope.go:117] "RemoveContainer" containerID="e38cfa2c3ac96667f583aa0347a2bbecb04a96d6a9691f159934b5fa6cf711fa" Feb 01 08:39:25 crc kubenswrapper[4546]: I0201 08:39:25.420672 4546 
patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:39:25 crc kubenswrapper[4546]: I0201 08:39:25.421244 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:39:55 crc kubenswrapper[4546]: I0201 08:39:55.420572 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:39:55 crc kubenswrapper[4546]: I0201 08:39:55.421244 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:40:25 crc kubenswrapper[4546]: I0201 08:40:25.421379 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:40:25 crc kubenswrapper[4546]: I0201 08:40:25.422280 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:40:25 crc kubenswrapper[4546]: I0201 08:40:25.422358 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 08:40:25 crc kubenswrapper[4546]: I0201 08:40:25.423718 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:40:25 crc kubenswrapper[4546]: I0201 08:40:25.423802 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" gracePeriod=600 Feb 01 08:40:25 crc kubenswrapper[4546]: E0201 08:40:25.549919 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:40:25 crc kubenswrapper[4546]: I0201 08:40:25.862237 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" exitCode=0 Feb 01 
08:40:25 crc kubenswrapper[4546]: I0201 08:40:25.862583 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676"} Feb 01 08:40:25 crc kubenswrapper[4546]: I0201 08:40:25.862640 4546 scope.go:117] "RemoveContainer" containerID="2e0e99976bb04e7150ed3e35dc3b13b29f7813e94e517db90cbadca43f6f05f6" Feb 01 08:40:25 crc kubenswrapper[4546]: I0201 08:40:25.863530 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:40:25 crc kubenswrapper[4546]: E0201 08:40:25.863957 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:40:39 crc kubenswrapper[4546]: I0201 08:40:39.667328 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:40:39 crc kubenswrapper[4546]: E0201 08:40:39.672743 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:40:51 crc kubenswrapper[4546]: I0201 08:40:51.655680 4546 scope.go:117] "RemoveContainer" 
containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:40:51 crc kubenswrapper[4546]: E0201 08:40:51.656738 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:41:05 crc kubenswrapper[4546]: I0201 08:41:05.657377 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:41:05 crc kubenswrapper[4546]: E0201 08:41:05.658728 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:41:17 crc kubenswrapper[4546]: I0201 08:41:17.655382 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:41:17 crc kubenswrapper[4546]: E0201 08:41:17.656473 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:41:28 crc kubenswrapper[4546]: I0201 08:41:28.654518 4546 scope.go:117] 
"RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:41:28 crc kubenswrapper[4546]: E0201 08:41:28.655297 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:41:37 crc kubenswrapper[4546]: I0201 08:41:37.751162 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pzjcb"] Feb 01 08:41:37 crc kubenswrapper[4546]: E0201 08:41:37.752149 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397b829d-4a39-4387-971f-13c59e047664" containerName="registry-server" Feb 01 08:41:37 crc kubenswrapper[4546]: I0201 08:41:37.752164 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="397b829d-4a39-4387-971f-13c59e047664" containerName="registry-server" Feb 01 08:41:37 crc kubenswrapper[4546]: E0201 08:41:37.752197 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397b829d-4a39-4387-971f-13c59e047664" containerName="extract-content" Feb 01 08:41:37 crc kubenswrapper[4546]: I0201 08:41:37.752203 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="397b829d-4a39-4387-971f-13c59e047664" containerName="extract-content" Feb 01 08:41:37 crc kubenswrapper[4546]: E0201 08:41:37.752232 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397b829d-4a39-4387-971f-13c59e047664" containerName="extract-utilities" Feb 01 08:41:37 crc kubenswrapper[4546]: I0201 08:41:37.752237 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="397b829d-4a39-4387-971f-13c59e047664" containerName="extract-utilities" Feb 01 08:41:37 crc kubenswrapper[4546]: 
I0201 08:41:37.752421 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="397b829d-4a39-4387-971f-13c59e047664" containerName="registry-server" Feb 01 08:41:37 crc kubenswrapper[4546]: I0201 08:41:37.753678 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:37 crc kubenswrapper[4546]: I0201 08:41:37.772583 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzjcb"] Feb 01 08:41:37 crc kubenswrapper[4546]: I0201 08:41:37.902575 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-catalog-content\") pod \"community-operators-pzjcb\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:37 crc kubenswrapper[4546]: I0201 08:41:37.902646 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7df2\" (UniqueName: \"kubernetes.io/projected/40bfb484-e7d8-445c-ab5a-df774cfc772f-kube-api-access-k7df2\") pod \"community-operators-pzjcb\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:37 crc kubenswrapper[4546]: I0201 08:41:37.902814 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-utilities\") pod \"community-operators-pzjcb\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:38 crc kubenswrapper[4546]: I0201 08:41:38.005316 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-utilities\") pod \"community-operators-pzjcb\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:38 crc kubenswrapper[4546]: I0201 08:41:38.005509 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-catalog-content\") pod \"community-operators-pzjcb\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:38 crc kubenswrapper[4546]: I0201 08:41:38.005571 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7df2\" (UniqueName: \"kubernetes.io/projected/40bfb484-e7d8-445c-ab5a-df774cfc772f-kube-api-access-k7df2\") pod \"community-operators-pzjcb\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:38 crc kubenswrapper[4546]: I0201 08:41:38.005750 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-utilities\") pod \"community-operators-pzjcb\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:38 crc kubenswrapper[4546]: I0201 08:41:38.005928 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-catalog-content\") pod \"community-operators-pzjcb\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:38 crc kubenswrapper[4546]: I0201 08:41:38.033683 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7df2\" (UniqueName: 
\"kubernetes.io/projected/40bfb484-e7d8-445c-ab5a-df774cfc772f-kube-api-access-k7df2\") pod \"community-operators-pzjcb\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:38 crc kubenswrapper[4546]: I0201 08:41:38.072788 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:38 crc kubenswrapper[4546]: I0201 08:41:38.511013 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzjcb"] Feb 01 08:41:38 crc kubenswrapper[4546]: I0201 08:41:38.546739 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzjcb" event={"ID":"40bfb484-e7d8-445c-ab5a-df774cfc772f","Type":"ContainerStarted","Data":"38b489dd6e7f36a877790e977554ac8186ce567ae6b31d8b9383acec48093251"} Feb 01 08:41:39 crc kubenswrapper[4546]: I0201 08:41:39.558316 4546 generic.go:334] "Generic (PLEG): container finished" podID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerID="92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab" exitCode=0 Feb 01 08:41:39 crc kubenswrapper[4546]: I0201 08:41:39.558427 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzjcb" event={"ID":"40bfb484-e7d8-445c-ab5a-df774cfc772f","Type":"ContainerDied","Data":"92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab"} Feb 01 08:41:39 crc kubenswrapper[4546]: I0201 08:41:39.561932 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:41:39 crc kubenswrapper[4546]: I0201 08:41:39.660526 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:41:39 crc kubenswrapper[4546]: E0201 08:41:39.675641 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:41:40 crc kubenswrapper[4546]: I0201 08:41:40.569892 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzjcb" event={"ID":"40bfb484-e7d8-445c-ab5a-df774cfc772f","Type":"ContainerStarted","Data":"2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd"} Feb 01 08:41:41 crc kubenswrapper[4546]: I0201 08:41:41.582374 4546 generic.go:334] "Generic (PLEG): container finished" podID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerID="2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd" exitCode=0 Feb 01 08:41:41 crc kubenswrapper[4546]: I0201 08:41:41.582474 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzjcb" event={"ID":"40bfb484-e7d8-445c-ab5a-df774cfc772f","Type":"ContainerDied","Data":"2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd"} Feb 01 08:41:42 crc kubenswrapper[4546]: I0201 08:41:42.596419 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzjcb" event={"ID":"40bfb484-e7d8-445c-ab5a-df774cfc772f","Type":"ContainerStarted","Data":"0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e"} Feb 01 08:41:42 crc kubenswrapper[4546]: I0201 08:41:42.616759 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pzjcb" podStartSLOduration=3.058881111 podStartE2EDuration="5.6167325s" podCreationTimestamp="2026-02-01 08:41:37 +0000 UTC" firstStartedPulling="2026-02-01 08:41:39.561260306 +0000 UTC m=+7130.212196322" lastFinishedPulling="2026-02-01 08:41:42.119111695 
+0000 UTC m=+7132.770047711" observedRunningTime="2026-02-01 08:41:42.614523475 +0000 UTC m=+7133.265459491" watchObservedRunningTime="2026-02-01 08:41:42.6167325 +0000 UTC m=+7133.267668516" Feb 01 08:41:48 crc kubenswrapper[4546]: I0201 08:41:48.073542 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:48 crc kubenswrapper[4546]: I0201 08:41:48.074048 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:48 crc kubenswrapper[4546]: I0201 08:41:48.113433 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:48 crc kubenswrapper[4546]: I0201 08:41:48.701442 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:48 crc kubenswrapper[4546]: I0201 08:41:48.748390 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzjcb"] Feb 01 08:41:50 crc kubenswrapper[4546]: I0201 08:41:50.666885 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pzjcb" podUID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerName="registry-server" containerID="cri-o://0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e" gracePeriod=2 Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.228576 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.331350 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7df2\" (UniqueName: \"kubernetes.io/projected/40bfb484-e7d8-445c-ab5a-df774cfc772f-kube-api-access-k7df2\") pod \"40bfb484-e7d8-445c-ab5a-df774cfc772f\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.331444 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-utilities\") pod \"40bfb484-e7d8-445c-ab5a-df774cfc772f\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.331508 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-catalog-content\") pod \"40bfb484-e7d8-445c-ab5a-df774cfc772f\" (UID: \"40bfb484-e7d8-445c-ab5a-df774cfc772f\") " Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.332603 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-utilities" (OuterVolumeSpecName: "utilities") pod "40bfb484-e7d8-445c-ab5a-df774cfc772f" (UID: "40bfb484-e7d8-445c-ab5a-df774cfc772f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.362008 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40bfb484-e7d8-445c-ab5a-df774cfc772f-kube-api-access-k7df2" (OuterVolumeSpecName: "kube-api-access-k7df2") pod "40bfb484-e7d8-445c-ab5a-df774cfc772f" (UID: "40bfb484-e7d8-445c-ab5a-df774cfc772f"). InnerVolumeSpecName "kube-api-access-k7df2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.387249 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40bfb484-e7d8-445c-ab5a-df774cfc772f" (UID: "40bfb484-e7d8-445c-ab5a-df774cfc772f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.433809 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.433842 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40bfb484-e7d8-445c-ab5a-df774cfc772f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.433867 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7df2\" (UniqueName: \"kubernetes.io/projected/40bfb484-e7d8-445c-ab5a-df774cfc772f-kube-api-access-k7df2\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.679464 4546 generic.go:334] "Generic (PLEG): container finished" podID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerID="0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e" exitCode=0 Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.679807 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzjcb" event={"ID":"40bfb484-e7d8-445c-ab5a-df774cfc772f","Type":"ContainerDied","Data":"0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e"} Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.679839 4546 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-pzjcb" event={"ID":"40bfb484-e7d8-445c-ab5a-df774cfc772f","Type":"ContainerDied","Data":"38b489dd6e7f36a877790e977554ac8186ce567ae6b31d8b9383acec48093251"} Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.679878 4546 scope.go:117] "RemoveContainer" containerID="0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.679995 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzjcb" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.705657 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzjcb"] Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.712108 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pzjcb"] Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.721106 4546 scope.go:117] "RemoveContainer" containerID="2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.739613 4546 scope.go:117] "RemoveContainer" containerID="92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.780585 4546 scope.go:117] "RemoveContainer" containerID="0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e" Feb 01 08:41:51 crc kubenswrapper[4546]: E0201 08:41:51.784348 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e\": container with ID starting with 0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e not found: ID does not exist" containerID="0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 
08:41:51.784394 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e"} err="failed to get container status \"0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e\": rpc error: code = NotFound desc = could not find container \"0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e\": container with ID starting with 0cb9b75914682132140ab674603de342a5cefd765a6847054b2f6999b6e8a63e not found: ID does not exist" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.784420 4546 scope.go:117] "RemoveContainer" containerID="2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd" Feb 01 08:41:51 crc kubenswrapper[4546]: E0201 08:41:51.784728 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd\": container with ID starting with 2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd not found: ID does not exist" containerID="2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.784753 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd"} err="failed to get container status \"2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd\": rpc error: code = NotFound desc = could not find container \"2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd\": container with ID starting with 2acf46e57d53b03ed859024357a17f1f40bf84d80143800a48574f44fdc6acbd not found: ID does not exist" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.784771 4546 scope.go:117] "RemoveContainer" containerID="92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab" Feb 01 08:41:51 crc 
kubenswrapper[4546]: E0201 08:41:51.786268 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab\": container with ID starting with 92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab not found: ID does not exist" containerID="92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab" Feb 01 08:41:51 crc kubenswrapper[4546]: I0201 08:41:51.786291 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab"} err="failed to get container status \"92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab\": rpc error: code = NotFound desc = could not find container \"92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab\": container with ID starting with 92ca9adf644c1e3b725fff9c4e64050490330e3e2a11c3fbc62a943bb3b489ab not found: ID does not exist" Feb 01 08:41:52 crc kubenswrapper[4546]: I0201 08:41:52.655347 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:41:52 crc kubenswrapper[4546]: E0201 08:41:52.655626 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:41:53 crc kubenswrapper[4546]: I0201 08:41:53.664009 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40bfb484-e7d8-445c-ab5a-df774cfc772f" path="/var/lib/kubelet/pods/40bfb484-e7d8-445c-ab5a-df774cfc772f/volumes" Feb 01 08:42:07 crc 
kubenswrapper[4546]: I0201 08:42:07.655164 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:42:07 crc kubenswrapper[4546]: E0201 08:42:07.656387 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:42:21 crc kubenswrapper[4546]: I0201 08:42:21.654600 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:42:21 crc kubenswrapper[4546]: E0201 08:42:21.655520 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.853590 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pg5nr"] Feb 01 08:42:28 crc kubenswrapper[4546]: E0201 08:42:28.854733 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerName="registry-server" Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.854751 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerName="registry-server" Feb 01 08:42:28 crc kubenswrapper[4546]: E0201 08:42:28.854776 4546 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerName="extract-utilities" Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.854782 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerName="extract-utilities" Feb 01 08:42:28 crc kubenswrapper[4546]: E0201 08:42:28.854795 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerName="extract-content" Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.854803 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerName="extract-content" Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.857616 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="40bfb484-e7d8-445c-ab5a-df774cfc772f" containerName="registry-server" Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.860668 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.887220 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pg5nr"] Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.966259 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-catalog-content\") pod \"certified-operators-pg5nr\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.966614 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hscpr\" (UniqueName: \"kubernetes.io/projected/1c643a5d-ca81-40d2-9efb-88279b3bb64a-kube-api-access-hscpr\") pod \"certified-operators-pg5nr\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:28 crc kubenswrapper[4546]: I0201 08:42:28.966921 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-utilities\") pod \"certified-operators-pg5nr\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:29 crc kubenswrapper[4546]: I0201 08:42:29.069559 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-utilities\") pod \"certified-operators-pg5nr\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:29 crc kubenswrapper[4546]: I0201 08:42:29.069971 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-catalog-content\") pod \"certified-operators-pg5nr\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:29 crc kubenswrapper[4546]: I0201 08:42:29.070066 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-utilities\") pod \"certified-operators-pg5nr\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:29 crc kubenswrapper[4546]: I0201 08:42:29.070224 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hscpr\" (UniqueName: \"kubernetes.io/projected/1c643a5d-ca81-40d2-9efb-88279b3bb64a-kube-api-access-hscpr\") pod \"certified-operators-pg5nr\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:29 crc kubenswrapper[4546]: I0201 08:42:29.070325 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-catalog-content\") pod \"certified-operators-pg5nr\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:29 crc kubenswrapper[4546]: I0201 08:42:29.091495 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hscpr\" (UniqueName: \"kubernetes.io/projected/1c643a5d-ca81-40d2-9efb-88279b3bb64a-kube-api-access-hscpr\") pod \"certified-operators-pg5nr\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:29 crc kubenswrapper[4546]: I0201 08:42:29.184308 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:30 crc kubenswrapper[4546]: I0201 08:42:29.889467 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pg5nr"] Feb 01 08:42:30 crc kubenswrapper[4546]: I0201 08:42:30.079367 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg5nr" event={"ID":"1c643a5d-ca81-40d2-9efb-88279b3bb64a","Type":"ContainerStarted","Data":"39323054e7b8aa7d692396cdb1345c6174eb1970195e9a248f4b2caaec866c04"} Feb 01 08:42:31 crc kubenswrapper[4546]: I0201 08:42:31.092393 4546 generic.go:334] "Generic (PLEG): container finished" podID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerID="04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2" exitCode=0 Feb 01 08:42:31 crc kubenswrapper[4546]: I0201 08:42:31.092474 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg5nr" event={"ID":"1c643a5d-ca81-40d2-9efb-88279b3bb64a","Type":"ContainerDied","Data":"04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2"} Feb 01 08:42:32 crc kubenswrapper[4546]: I0201 08:42:32.106405 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg5nr" event={"ID":"1c643a5d-ca81-40d2-9efb-88279b3bb64a","Type":"ContainerStarted","Data":"7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50"} Feb 01 08:42:33 crc kubenswrapper[4546]: I0201 08:42:33.119301 4546 generic.go:334] "Generic (PLEG): container finished" podID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerID="7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50" exitCode=0 Feb 01 08:42:33 crc kubenswrapper[4546]: I0201 08:42:33.119360 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg5nr" 
event={"ID":"1c643a5d-ca81-40d2-9efb-88279b3bb64a","Type":"ContainerDied","Data":"7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50"} Feb 01 08:42:34 crc kubenswrapper[4546]: I0201 08:42:34.129670 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg5nr" event={"ID":"1c643a5d-ca81-40d2-9efb-88279b3bb64a","Type":"ContainerStarted","Data":"711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25"} Feb 01 08:42:34 crc kubenswrapper[4546]: I0201 08:42:34.161419 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pg5nr" podStartSLOduration=3.675325483 podStartE2EDuration="6.161403888s" podCreationTimestamp="2026-02-01 08:42:28 +0000 UTC" firstStartedPulling="2026-02-01 08:42:31.095063463 +0000 UTC m=+7181.745999480" lastFinishedPulling="2026-02-01 08:42:33.581141869 +0000 UTC m=+7184.232077885" observedRunningTime="2026-02-01 08:42:34.158917009 +0000 UTC m=+7184.809853025" watchObservedRunningTime="2026-02-01 08:42:34.161403888 +0000 UTC m=+7184.812339904" Feb 01 08:42:35 crc kubenswrapper[4546]: I0201 08:42:35.655621 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:42:35 crc kubenswrapper[4546]: E0201 08:42:35.657506 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:42:39 crc kubenswrapper[4546]: I0201 08:42:39.186017 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:39 crc 
kubenswrapper[4546]: I0201 08:42:39.186416 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:39 crc kubenswrapper[4546]: I0201 08:42:39.223916 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:40 crc kubenswrapper[4546]: I0201 08:42:40.218555 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:40 crc kubenswrapper[4546]: I0201 08:42:40.283133 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pg5nr"] Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.195876 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pg5nr" podUID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerName="registry-server" containerID="cri-o://711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25" gracePeriod=2 Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.673947 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.785639 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-utilities\") pod \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.785967 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hscpr\" (UniqueName: \"kubernetes.io/projected/1c643a5d-ca81-40d2-9efb-88279b3bb64a-kube-api-access-hscpr\") pod \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.786357 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-utilities" (OuterVolumeSpecName: "utilities") pod "1c643a5d-ca81-40d2-9efb-88279b3bb64a" (UID: "1c643a5d-ca81-40d2-9efb-88279b3bb64a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.787265 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-catalog-content\") pod \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\" (UID: \"1c643a5d-ca81-40d2-9efb-88279b3bb64a\") " Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.788329 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.794926 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c643a5d-ca81-40d2-9efb-88279b3bb64a-kube-api-access-hscpr" (OuterVolumeSpecName: "kube-api-access-hscpr") pod "1c643a5d-ca81-40d2-9efb-88279b3bb64a" (UID: "1c643a5d-ca81-40d2-9efb-88279b3bb64a"). InnerVolumeSpecName "kube-api-access-hscpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.828062 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c643a5d-ca81-40d2-9efb-88279b3bb64a" (UID: "1c643a5d-ca81-40d2-9efb-88279b3bb64a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.890106 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hscpr\" (UniqueName: \"kubernetes.io/projected/1c643a5d-ca81-40d2-9efb-88279b3bb64a-kube-api-access-hscpr\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:42 crc kubenswrapper[4546]: I0201 08:42:42.890139 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c643a5d-ca81-40d2-9efb-88279b3bb64a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.206643 4546 generic.go:334] "Generic (PLEG): container finished" podID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerID="711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25" exitCode=0 Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.206717 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pg5nr" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.206740 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg5nr" event={"ID":"1c643a5d-ca81-40d2-9efb-88279b3bb64a","Type":"ContainerDied","Data":"711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25"} Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.207150 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg5nr" event={"ID":"1c643a5d-ca81-40d2-9efb-88279b3bb64a","Type":"ContainerDied","Data":"39323054e7b8aa7d692396cdb1345c6174eb1970195e9a248f4b2caaec866c04"} Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.207186 4546 scope.go:117] "RemoveContainer" containerID="711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.234465 4546 scope.go:117] "RemoveContainer" 
containerID="7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.239297 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pg5nr"] Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.260995 4546 scope.go:117] "RemoveContainer" containerID="04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.262004 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pg5nr"] Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.298110 4546 scope.go:117] "RemoveContainer" containerID="711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25" Feb 01 08:42:43 crc kubenswrapper[4546]: E0201 08:42:43.299248 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25\": container with ID starting with 711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25 not found: ID does not exist" containerID="711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.299384 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25"} err="failed to get container status \"711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25\": rpc error: code = NotFound desc = could not find container \"711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25\": container with ID starting with 711bdc1416dfa692b6b59441b96de6c089734a82d496ffa182b86ff2936baa25 not found: ID does not exist" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.299472 4546 scope.go:117] "RemoveContainer" 
containerID="7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50" Feb 01 08:42:43 crc kubenswrapper[4546]: E0201 08:42:43.299914 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50\": container with ID starting with 7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50 not found: ID does not exist" containerID="7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.299949 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50"} err="failed to get container status \"7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50\": rpc error: code = NotFound desc = could not find container \"7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50\": container with ID starting with 7b7c64820dcb4d9a91c92183cfea769da00b4a2494af3d7fb630c9cbfd9b5f50 not found: ID does not exist" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.299971 4546 scope.go:117] "RemoveContainer" containerID="04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2" Feb 01 08:42:43 crc kubenswrapper[4546]: E0201 08:42:43.301102 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2\": container with ID starting with 04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2 not found: ID does not exist" containerID="04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.301126 4546 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2"} err="failed to get container status \"04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2\": rpc error: code = NotFound desc = could not find container \"04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2\": container with ID starting with 04ab3ec07e51713c7f32f8b1772f14c8fdff451e5fcef633b3027401dea3ecb2 not found: ID does not exist" Feb 01 08:42:43 crc kubenswrapper[4546]: I0201 08:42:43.665635 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" path="/var/lib/kubelet/pods/1c643a5d-ca81-40d2-9efb-88279b3bb64a/volumes" Feb 01 08:42:49 crc kubenswrapper[4546]: I0201 08:42:49.660216 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:42:49 crc kubenswrapper[4546]: E0201 08:42:49.661194 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:43:04 crc kubenswrapper[4546]: I0201 08:43:04.655299 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:43:04 crc kubenswrapper[4546]: E0201 08:43:04.656480 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:43:15 crc kubenswrapper[4546]: I0201 08:43:15.655664 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:43:15 crc kubenswrapper[4546]: E0201 08:43:15.657028 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:43:28 crc kubenswrapper[4546]: I0201 08:43:28.655332 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:43:28 crc kubenswrapper[4546]: E0201 08:43:28.656308 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:43:41 crc kubenswrapper[4546]: I0201 08:43:41.654623 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:43:41 crc kubenswrapper[4546]: E0201 08:43:41.655629 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:43:55 crc kubenswrapper[4546]: I0201 08:43:55.656477 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:43:55 crc kubenswrapper[4546]: E0201 08:43:55.658479 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:44:09 crc kubenswrapper[4546]: I0201 08:44:09.661183 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:44:09 crc kubenswrapper[4546]: E0201 08:44:09.662124 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:44:22 crc kubenswrapper[4546]: I0201 08:44:22.655924 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:44:22 crc kubenswrapper[4546]: E0201 08:44:22.657142 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:44:35 crc kubenswrapper[4546]: I0201 08:44:35.655572 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:44:35 crc kubenswrapper[4546]: E0201 08:44:35.657472 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:44:46 crc kubenswrapper[4546]: I0201 08:44:46.655730 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:44:46 crc kubenswrapper[4546]: E0201 08:44:46.656540 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.236167 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq"] Feb 01 08:45:00 crc kubenswrapper[4546]: E0201 08:45:00.237253 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerName="registry-server" Feb 01 
08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.237271 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerName="registry-server" Feb 01 08:45:00 crc kubenswrapper[4546]: E0201 08:45:00.237286 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerName="extract-utilities" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.237291 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerName="extract-utilities" Feb 01 08:45:00 crc kubenswrapper[4546]: E0201 08:45:00.237301 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerName="extract-content" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.237306 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerName="extract-content" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.237517 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c643a5d-ca81-40d2-9efb-88279b3bb64a" containerName="registry-server" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.238241 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.249113 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.251280 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq"] Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.251351 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.279634 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/992f8a57-9795-4977-aa4c-f7b629741561-secret-volume\") pod \"collect-profiles-29498925-cn7tq\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.279754 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/992f8a57-9795-4977-aa4c-f7b629741561-config-volume\") pod \"collect-profiles-29498925-cn7tq\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.279916 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62g4s\" (UniqueName: \"kubernetes.io/projected/992f8a57-9795-4977-aa4c-f7b629741561-kube-api-access-62g4s\") pod \"collect-profiles-29498925-cn7tq\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.382277 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/992f8a57-9795-4977-aa4c-f7b629741561-config-volume\") pod \"collect-profiles-29498925-cn7tq\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.382489 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62g4s\" (UniqueName: \"kubernetes.io/projected/992f8a57-9795-4977-aa4c-f7b629741561-kube-api-access-62g4s\") pod \"collect-profiles-29498925-cn7tq\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.382713 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/992f8a57-9795-4977-aa4c-f7b629741561-secret-volume\") pod \"collect-profiles-29498925-cn7tq\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.383105 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/992f8a57-9795-4977-aa4c-f7b629741561-config-volume\") pod \"collect-profiles-29498925-cn7tq\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.389015 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/992f8a57-9795-4977-aa4c-f7b629741561-secret-volume\") pod \"collect-profiles-29498925-cn7tq\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.399816 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62g4s\" (UniqueName: \"kubernetes.io/projected/992f8a57-9795-4977-aa4c-f7b629741561-kube-api-access-62g4s\") pod \"collect-profiles-29498925-cn7tq\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.566616 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:00 crc kubenswrapper[4546]: I0201 08:45:00.656788 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:45:00 crc kubenswrapper[4546]: E0201 08:45:00.657254 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:45:01 crc kubenswrapper[4546]: I0201 08:45:01.032187 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq"] Feb 01 08:45:01 crc kubenswrapper[4546]: I0201 08:45:01.426281 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" 
event={"ID":"992f8a57-9795-4977-aa4c-f7b629741561","Type":"ContainerStarted","Data":"0d6cc575a7dea551e5ad616145f0ae0ff437d0a3c795ba825c21d57bb8b0da43"} Feb 01 08:45:01 crc kubenswrapper[4546]: I0201 08:45:01.426335 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" event={"ID":"992f8a57-9795-4977-aa4c-f7b629741561","Type":"ContainerStarted","Data":"6ac49095f80069b85a6c7e53abeab822a52b68f5aab6ced2636e9fd249f37f88"} Feb 01 08:45:01 crc kubenswrapper[4546]: I0201 08:45:01.454271 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" podStartSLOduration=1.454254149 podStartE2EDuration="1.454254149s" podCreationTimestamp="2026-02-01 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:01.450665133 +0000 UTC m=+7332.101601150" watchObservedRunningTime="2026-02-01 08:45:01.454254149 +0000 UTC m=+7332.105190166" Feb 01 08:45:02 crc kubenswrapper[4546]: I0201 08:45:02.436628 4546 generic.go:334] "Generic (PLEG): container finished" podID="992f8a57-9795-4977-aa4c-f7b629741561" containerID="0d6cc575a7dea551e5ad616145f0ae0ff437d0a3c795ba825c21d57bb8b0da43" exitCode=0 Feb 01 08:45:02 crc kubenswrapper[4546]: I0201 08:45:02.436747 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" event={"ID":"992f8a57-9795-4977-aa4c-f7b629741561","Type":"ContainerDied","Data":"0d6cc575a7dea551e5ad616145f0ae0ff437d0a3c795ba825c21d57bb8b0da43"} Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.793079 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.865408 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/992f8a57-9795-4977-aa4c-f7b629741561-config-volume\") pod \"992f8a57-9795-4977-aa4c-f7b629741561\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.865461 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/992f8a57-9795-4977-aa4c-f7b629741561-secret-volume\") pod \"992f8a57-9795-4977-aa4c-f7b629741561\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.865544 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62g4s\" (UniqueName: \"kubernetes.io/projected/992f8a57-9795-4977-aa4c-f7b629741561-kube-api-access-62g4s\") pod \"992f8a57-9795-4977-aa4c-f7b629741561\" (UID: \"992f8a57-9795-4977-aa4c-f7b629741561\") " Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.867151 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/992f8a57-9795-4977-aa4c-f7b629741561-config-volume" (OuterVolumeSpecName: "config-volume") pod "992f8a57-9795-4977-aa4c-f7b629741561" (UID: "992f8a57-9795-4977-aa4c-f7b629741561"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.873157 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992f8a57-9795-4977-aa4c-f7b629741561-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "992f8a57-9795-4977-aa4c-f7b629741561" (UID: "992f8a57-9795-4977-aa4c-f7b629741561"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.873548 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992f8a57-9795-4977-aa4c-f7b629741561-kube-api-access-62g4s" (OuterVolumeSpecName: "kube-api-access-62g4s") pod "992f8a57-9795-4977-aa4c-f7b629741561" (UID: "992f8a57-9795-4977-aa4c-f7b629741561"). InnerVolumeSpecName "kube-api-access-62g4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.968076 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/992f8a57-9795-4977-aa4c-f7b629741561-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.968109 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/992f8a57-9795-4977-aa4c-f7b629741561-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[4546]: I0201 08:45:03.968120 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62g4s\" (UniqueName: \"kubernetes.io/projected/992f8a57-9795-4977-aa4c-f7b629741561-kube-api-access-62g4s\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:04 crc kubenswrapper[4546]: I0201 08:45:04.455115 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" event={"ID":"992f8a57-9795-4977-aa4c-f7b629741561","Type":"ContainerDied","Data":"6ac49095f80069b85a6c7e53abeab822a52b68f5aab6ced2636e9fd249f37f88"} Feb 01 08:45:04 crc kubenswrapper[4546]: I0201 08:45:04.455180 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-cn7tq" Feb 01 08:45:04 crc kubenswrapper[4546]: I0201 08:45:04.455521 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ac49095f80069b85a6c7e53abeab822a52b68f5aab6ced2636e9fd249f37f88" Feb 01 08:45:04 crc kubenswrapper[4546]: I0201 08:45:04.542846 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx"] Feb 01 08:45:04 crc kubenswrapper[4546]: I0201 08:45:04.552840 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-d8jrx"] Feb 01 08:45:05 crc kubenswrapper[4546]: I0201 08:45:05.664004 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f4ab5a-49be-4c8f-9803-620b76bea9e0" path="/var/lib/kubelet/pods/38f4ab5a-49be-4c8f-9803-620b76bea9e0/volumes" Feb 01 08:45:14 crc kubenswrapper[4546]: I0201 08:45:14.654730 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:45:14 crc kubenswrapper[4546]: E0201 08:45:14.655784 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:45:27 crc kubenswrapper[4546]: I0201 08:45:27.655876 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:45:28 crc kubenswrapper[4546]: I0201 08:45:28.647369 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"10339ac69fb4b2e979aa6366536108cf1c17d49cc38aab41c61667c2a4fba969"} Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.079684 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x6xsd"] Feb 01 08:45:44 crc kubenswrapper[4546]: E0201 08:45:44.081087 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992f8a57-9795-4977-aa4c-f7b629741561" containerName="collect-profiles" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.081110 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="992f8a57-9795-4977-aa4c-f7b629741561" containerName="collect-profiles" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.081387 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="992f8a57-9795-4977-aa4c-f7b629741561" containerName="collect-profiles" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.084138 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.115131 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6xsd"] Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.178780 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-catalog-content\") pod \"redhat-operators-x6xsd\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.178824 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ss2f\" (UniqueName: \"kubernetes.io/projected/b3f46a92-76c8-4188-a149-7a221fa36b9e-kube-api-access-5ss2f\") pod \"redhat-operators-x6xsd\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.179101 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-utilities\") pod \"redhat-operators-x6xsd\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.280809 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-catalog-content\") pod \"redhat-operators-x6xsd\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.280873 4546 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-5ss2f\" (UniqueName: \"kubernetes.io/projected/b3f46a92-76c8-4188-a149-7a221fa36b9e-kube-api-access-5ss2f\") pod \"redhat-operators-x6xsd\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.281006 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-utilities\") pod \"redhat-operators-x6xsd\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.281577 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-utilities\") pod \"redhat-operators-x6xsd\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.281717 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-catalog-content\") pod \"redhat-operators-x6xsd\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.314288 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ss2f\" (UniqueName: \"kubernetes.io/projected/b3f46a92-76c8-4188-a149-7a221fa36b9e-kube-api-access-5ss2f\") pod \"redhat-operators-x6xsd\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.406602 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:44 crc kubenswrapper[4546]: I0201 08:45:44.830327 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6xsd"] Feb 01 08:45:45 crc kubenswrapper[4546]: I0201 08:45:45.800352 4546 generic.go:334] "Generic (PLEG): container finished" podID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerID="25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e" exitCode=0 Feb 01 08:45:45 crc kubenswrapper[4546]: I0201 08:45:45.800784 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6xsd" event={"ID":"b3f46a92-76c8-4188-a149-7a221fa36b9e","Type":"ContainerDied","Data":"25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e"} Feb 01 08:45:45 crc kubenswrapper[4546]: I0201 08:45:45.800917 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6xsd" event={"ID":"b3f46a92-76c8-4188-a149-7a221fa36b9e","Type":"ContainerStarted","Data":"28aac8030bd0991abd54e50231a949cd7cf06969ed38adb83503873f0c92fa27"} Feb 01 08:45:46 crc kubenswrapper[4546]: I0201 08:45:46.810604 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6xsd" event={"ID":"b3f46a92-76c8-4188-a149-7a221fa36b9e","Type":"ContainerStarted","Data":"7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411"} Feb 01 08:45:50 crc kubenswrapper[4546]: I0201 08:45:50.875422 4546 generic.go:334] "Generic (PLEG): container finished" podID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerID="7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411" exitCode=0 Feb 01 08:45:50 crc kubenswrapper[4546]: I0201 08:45:50.875599 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6xsd" 
event={"ID":"b3f46a92-76c8-4188-a149-7a221fa36b9e","Type":"ContainerDied","Data":"7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411"} Feb 01 08:45:51 crc kubenswrapper[4546]: I0201 08:45:51.889774 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6xsd" event={"ID":"b3f46a92-76c8-4188-a149-7a221fa36b9e","Type":"ContainerStarted","Data":"92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2"} Feb 01 08:45:51 crc kubenswrapper[4546]: I0201 08:45:51.912277 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x6xsd" podStartSLOduration=2.357558088 podStartE2EDuration="7.91225054s" podCreationTimestamp="2026-02-01 08:45:44 +0000 UTC" firstStartedPulling="2026-02-01 08:45:45.802620699 +0000 UTC m=+7376.453556715" lastFinishedPulling="2026-02-01 08:45:51.357313161 +0000 UTC m=+7382.008249167" observedRunningTime="2026-02-01 08:45:51.910789565 +0000 UTC m=+7382.561725582" watchObservedRunningTime="2026-02-01 08:45:51.91225054 +0000 UTC m=+7382.563186556" Feb 01 08:45:54 crc kubenswrapper[4546]: I0201 08:45:54.407033 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:54 crc kubenswrapper[4546]: I0201 08:45:54.407415 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:45:55 crc kubenswrapper[4546]: I0201 08:45:55.450343 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x6xsd" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerName="registry-server" probeResult="failure" output=< Feb 01 08:45:55 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 08:45:55 crc kubenswrapper[4546]: > Feb 01 08:46:00 crc kubenswrapper[4546]: I0201 08:46:00.064511 4546 scope.go:117] "RemoveContainer" 
containerID="e9f3083d4991f9c8b6c03036c9f71b017f1f5706dd2e38eaacbaa007a4dc187f" Feb 01 08:46:04 crc kubenswrapper[4546]: I0201 08:46:04.446631 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:46:04 crc kubenswrapper[4546]: I0201 08:46:04.494937 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:46:04 crc kubenswrapper[4546]: I0201 08:46:04.683841 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6xsd"] Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.024789 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x6xsd" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerName="registry-server" containerID="cri-o://92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2" gracePeriod=2 Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.682740 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.708664 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ss2f\" (UniqueName: \"kubernetes.io/projected/b3f46a92-76c8-4188-a149-7a221fa36b9e-kube-api-access-5ss2f\") pod \"b3f46a92-76c8-4188-a149-7a221fa36b9e\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.708808 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-catalog-content\") pod \"b3f46a92-76c8-4188-a149-7a221fa36b9e\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.708947 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-utilities\") pod \"b3f46a92-76c8-4188-a149-7a221fa36b9e\" (UID: \"b3f46a92-76c8-4188-a149-7a221fa36b9e\") " Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.709512 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-utilities" (OuterVolumeSpecName: "utilities") pod "b3f46a92-76c8-4188-a149-7a221fa36b9e" (UID: "b3f46a92-76c8-4188-a149-7a221fa36b9e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.710654 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.731816 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3f46a92-76c8-4188-a149-7a221fa36b9e-kube-api-access-5ss2f" (OuterVolumeSpecName: "kube-api-access-5ss2f") pod "b3f46a92-76c8-4188-a149-7a221fa36b9e" (UID: "b3f46a92-76c8-4188-a149-7a221fa36b9e"). InnerVolumeSpecName "kube-api-access-5ss2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.792631 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3f46a92-76c8-4188-a149-7a221fa36b9e" (UID: "b3f46a92-76c8-4188-a149-7a221fa36b9e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.812471 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ss2f\" (UniqueName: \"kubernetes.io/projected/b3f46a92-76c8-4188-a149-7a221fa36b9e-kube-api-access-5ss2f\") on node \"crc\" DevicePath \"\"" Feb 01 08:46:06 crc kubenswrapper[4546]: I0201 08:46:06.812653 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f46a92-76c8-4188-a149-7a221fa36b9e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.034011 4546 generic.go:334] "Generic (PLEG): container finished" podID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerID="92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2" exitCode=0 Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.034060 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6xsd" event={"ID":"b3f46a92-76c8-4188-a149-7a221fa36b9e","Type":"ContainerDied","Data":"92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2"} Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.034094 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6xsd" event={"ID":"b3f46a92-76c8-4188-a149-7a221fa36b9e","Type":"ContainerDied","Data":"28aac8030bd0991abd54e50231a949cd7cf06969ed38adb83503873f0c92fa27"} Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.034117 4546 scope.go:117] "RemoveContainer" containerID="92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.034250 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6xsd" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.058115 4546 scope.go:117] "RemoveContainer" containerID="7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.081737 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6xsd"] Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.085060 4546 scope.go:117] "RemoveContainer" containerID="25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.098683 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x6xsd"] Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.125019 4546 scope.go:117] "RemoveContainer" containerID="92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2" Feb 01 08:46:07 crc kubenswrapper[4546]: E0201 08:46:07.129367 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2\": container with ID starting with 92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2 not found: ID does not exist" containerID="92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.129409 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2"} err="failed to get container status \"92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2\": rpc error: code = NotFound desc = could not find container \"92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2\": container with ID starting with 92e0011c97e7b7babe7ac6ff59d484c4e35ae2ea13cf062961df97f11bf21ba2 not found: ID does 
not exist" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.129437 4546 scope.go:117] "RemoveContainer" containerID="7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411" Feb 01 08:46:07 crc kubenswrapper[4546]: E0201 08:46:07.133926 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411\": container with ID starting with 7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411 not found: ID does not exist" containerID="7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.133959 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411"} err="failed to get container status \"7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411\": rpc error: code = NotFound desc = could not find container \"7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411\": container with ID starting with 7678be1da58100884d0ca591877b90dd7982dd26034340b3e34b46bbf29a2411 not found: ID does not exist" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.133981 4546 scope.go:117] "RemoveContainer" containerID="25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e" Feb 01 08:46:07 crc kubenswrapper[4546]: E0201 08:46:07.143124 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e\": container with ID starting with 25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e not found: ID does not exist" containerID="25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.143160 4546 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e"} err="failed to get container status \"25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e\": rpc error: code = NotFound desc = could not find container \"25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e\": container with ID starting with 25ebbfc722bfa9b504b9039c5fc9128e1ccfd6ed3e01f86c2930c9e201e2688e not found: ID does not exist" Feb 01 08:46:07 crc kubenswrapper[4546]: I0201 08:46:07.667304 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" path="/var/lib/kubelet/pods/b3f46a92-76c8-4188-a149-7a221fa36b9e/volumes" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.111986 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lx4nj"] Feb 01 08:46:27 crc kubenswrapper[4546]: E0201 08:46:27.113248 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerName="extract-content" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.113272 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerName="extract-content" Feb 01 08:46:27 crc kubenswrapper[4546]: E0201 08:46:27.113305 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerName="extract-utilities" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.113312 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerName="extract-utilities" Feb 01 08:46:27 crc kubenswrapper[4546]: E0201 08:46:27.113350 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerName="registry-server" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.113355 4546 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerName="registry-server" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.113580 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3f46a92-76c8-4188-a149-7a221fa36b9e" containerName="registry-server" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.115644 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.138684 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lx4nj"] Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.235310 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-utilities\") pod \"redhat-marketplace-lx4nj\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.235434 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvtqs\" (UniqueName: \"kubernetes.io/projected/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-kube-api-access-kvtqs\") pod \"redhat-marketplace-lx4nj\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.235473 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-catalog-content\") pod \"redhat-marketplace-lx4nj\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.339348 
4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-utilities\") pod \"redhat-marketplace-lx4nj\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.339490 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvtqs\" (UniqueName: \"kubernetes.io/projected/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-kube-api-access-kvtqs\") pod \"redhat-marketplace-lx4nj\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.339523 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-catalog-content\") pod \"redhat-marketplace-lx4nj\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.339851 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-utilities\") pod \"redhat-marketplace-lx4nj\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.340113 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-catalog-content\") pod \"redhat-marketplace-lx4nj\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.365451 4546 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kvtqs\" (UniqueName: \"kubernetes.io/projected/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-kube-api-access-kvtqs\") pod \"redhat-marketplace-lx4nj\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.436483 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:27 crc kubenswrapper[4546]: I0201 08:46:27.881441 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lx4nj"] Feb 01 08:46:28 crc kubenswrapper[4546]: I0201 08:46:28.218984 4546 generic.go:334] "Generic (PLEG): container finished" podID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerID="f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537" exitCode=0 Feb 01 08:46:28 crc kubenswrapper[4546]: I0201 08:46:28.219050 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lx4nj" event={"ID":"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80","Type":"ContainerDied","Data":"f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537"} Feb 01 08:46:28 crc kubenswrapper[4546]: I0201 08:46:28.219371 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lx4nj" event={"ID":"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80","Type":"ContainerStarted","Data":"4e5dba3174cc87be2a566b6086fcfc4f7f9aa27f8ae2c391fd9282c37d3bd89e"} Feb 01 08:46:29 crc kubenswrapper[4546]: I0201 08:46:29.233971 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lx4nj" event={"ID":"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80","Type":"ContainerStarted","Data":"2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc"} Feb 01 08:46:30 crc kubenswrapper[4546]: I0201 08:46:30.242640 4546 generic.go:334] "Generic (PLEG): container finished" 
podID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerID="2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc" exitCode=0 Feb 01 08:46:30 crc kubenswrapper[4546]: I0201 08:46:30.242738 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lx4nj" event={"ID":"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80","Type":"ContainerDied","Data":"2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc"} Feb 01 08:46:31 crc kubenswrapper[4546]: I0201 08:46:31.254050 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lx4nj" event={"ID":"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80","Type":"ContainerStarted","Data":"1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76"} Feb 01 08:46:31 crc kubenswrapper[4546]: I0201 08:46:31.292784 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lx4nj" podStartSLOduration=1.7837575129999999 podStartE2EDuration="4.292760755s" podCreationTimestamp="2026-02-01 08:46:27 +0000 UTC" firstStartedPulling="2026-02-01 08:46:28.220649411 +0000 UTC m=+7418.871585427" lastFinishedPulling="2026-02-01 08:46:30.729652653 +0000 UTC m=+7421.380588669" observedRunningTime="2026-02-01 08:46:31.277243741 +0000 UTC m=+7421.928179748" watchObservedRunningTime="2026-02-01 08:46:31.292760755 +0000 UTC m=+7421.943696772" Feb 01 08:46:37 crc kubenswrapper[4546]: I0201 08:46:37.437636 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:37 crc kubenswrapper[4546]: I0201 08:46:37.438370 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:37 crc kubenswrapper[4546]: I0201 08:46:37.486197 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 
08:46:38 crc kubenswrapper[4546]: I0201 08:46:38.356922 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:38 crc kubenswrapper[4546]: I0201 08:46:38.402108 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lx4nj"] Feb 01 08:46:40 crc kubenswrapper[4546]: I0201 08:46:40.338952 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lx4nj" podUID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerName="registry-server" containerID="cri-o://1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76" gracePeriod=2 Feb 01 08:46:40 crc kubenswrapper[4546]: I0201 08:46:40.882561 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:40 crc kubenswrapper[4546]: I0201 08:46:40.978753 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvtqs\" (UniqueName: \"kubernetes.io/projected/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-kube-api-access-kvtqs\") pod \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " Feb 01 08:46:40 crc kubenswrapper[4546]: I0201 08:46:40.978928 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-catalog-content\") pod \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " Feb 01 08:46:40 crc kubenswrapper[4546]: I0201 08:46:40.978991 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-utilities\") pod \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\" (UID: \"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80\") " Feb 01 
08:46:40 crc kubenswrapper[4546]: I0201 08:46:40.979829 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-utilities" (OuterVolumeSpecName: "utilities") pod "a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" (UID: "a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:46:40 crc kubenswrapper[4546]: I0201 08:46:40.980114 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:46:40 crc kubenswrapper[4546]: I0201 08:46:40.988161 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-kube-api-access-kvtqs" (OuterVolumeSpecName: "kube-api-access-kvtqs") pod "a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" (UID: "a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80"). InnerVolumeSpecName "kube-api-access-kvtqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:46:40 crc kubenswrapper[4546]: I0201 08:46:40.996553 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" (UID: "a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.081615 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvtqs\" (UniqueName: \"kubernetes.io/projected/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-kube-api-access-kvtqs\") on node \"crc\" DevicePath \"\"" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.081656 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.365064 4546 generic.go:334] "Generic (PLEG): container finished" podID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerID="1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76" exitCode=0 Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.365396 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lx4nj" event={"ID":"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80","Type":"ContainerDied","Data":"1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76"} Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.365440 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lx4nj" event={"ID":"a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80","Type":"ContainerDied","Data":"4e5dba3174cc87be2a566b6086fcfc4f7f9aa27f8ae2c391fd9282c37d3bd89e"} Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.365461 4546 scope.go:117] "RemoveContainer" containerID="1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.365648 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lx4nj" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.400380 4546 scope.go:117] "RemoveContainer" containerID="2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.415017 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lx4nj"] Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.421785 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lx4nj"] Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.424300 4546 scope.go:117] "RemoveContainer" containerID="f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.456774 4546 scope.go:117] "RemoveContainer" containerID="1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76" Feb 01 08:46:41 crc kubenswrapper[4546]: E0201 08:46:41.457113 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76\": container with ID starting with 1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76 not found: ID does not exist" containerID="1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.457155 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76"} err="failed to get container status \"1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76\": rpc error: code = NotFound desc = could not find container \"1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76\": container with ID starting with 1889222efc1ebb6a1cb38430336f12f7ea3e1c4be6d0eefaf50f48709d416e76 not found: 
ID does not exist" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.457180 4546 scope.go:117] "RemoveContainer" containerID="2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc" Feb 01 08:46:41 crc kubenswrapper[4546]: E0201 08:46:41.457456 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc\": container with ID starting with 2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc not found: ID does not exist" containerID="2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.457520 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc"} err="failed to get container status \"2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc\": rpc error: code = NotFound desc = could not find container \"2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc\": container with ID starting with 2769dfabd76fdea8008bfd169d5e7ae630f4d0b24a14592515188dc62c306ecc not found: ID does not exist" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.457533 4546 scope.go:117] "RemoveContainer" containerID="f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537" Feb 01 08:46:41 crc kubenswrapper[4546]: E0201 08:46:41.457776 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537\": container with ID starting with f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537 not found: ID does not exist" containerID="f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.457807 4546 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537"} err="failed to get container status \"f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537\": rpc error: code = NotFound desc = could not find container \"f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537\": container with ID starting with f13303467c39a28dd02b1a79c8313cf114e2feaaabf957298e67288fb2186537 not found: ID does not exist" Feb 01 08:46:41 crc kubenswrapper[4546]: I0201 08:46:41.666740 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" path="/var/lib/kubelet/pods/a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80/volumes" Feb 01 08:47:55 crc kubenswrapper[4546]: I0201 08:47:55.420796 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:47:55 crc kubenswrapper[4546]: I0201 08:47:55.422311 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:48:25 crc kubenswrapper[4546]: I0201 08:48:25.420492 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:48:25 crc kubenswrapper[4546]: I0201 08:48:25.421211 4546 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.649058 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-86c88c587c-hxf72"] Feb 01 08:48:43 crc kubenswrapper[4546]: E0201 08:48:43.652167 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerName="extract-content" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.652213 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerName="extract-content" Feb 01 08:48:43 crc kubenswrapper[4546]: E0201 08:48:43.652227 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerName="registry-server" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.652233 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerName="registry-server" Feb 01 08:48:43 crc kubenswrapper[4546]: E0201 08:48:43.652322 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerName="extract-utilities" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.652328 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerName="extract-utilities" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.653225 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5cb9ff2-46e2-4d8a-ab84-9a96f51cda80" containerName="registry-server" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.656207 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.715173 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86c88c587c-hxf72"] Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.777509 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-internal-tls-certs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.778654 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-ovndb-tls-certs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.778884 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjpgs\" (UniqueName: \"kubernetes.io/projected/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-kube-api-access-pjpgs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.778981 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-httpd-config\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.779090 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-public-tls-certs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.779220 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-config\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.779390 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-combined-ca-bundle\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.882274 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjpgs\" (UniqueName: \"kubernetes.io/projected/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-kube-api-access-pjpgs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.882342 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-httpd-config\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.882377 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-public-tls-certs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.882444 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-config\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.882493 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-combined-ca-bundle\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.882539 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-internal-tls-certs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.882798 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-ovndb-tls-certs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.891209 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-internal-tls-certs\") pod 
\"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.891255 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-ovndb-tls-certs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.891302 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-public-tls-certs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.897693 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-combined-ca-bundle\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.897697 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-config\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.897697 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-httpd-config\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc 
kubenswrapper[4546]: I0201 08:48:43.898982 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjpgs\" (UniqueName: \"kubernetes.io/projected/b6a00d61-5d12-4ee2-a74e-b11ec568dfab-kube-api-access-pjpgs\") pod \"neutron-86c88c587c-hxf72\" (UID: \"b6a00d61-5d12-4ee2-a74e-b11ec568dfab\") " pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:43 crc kubenswrapper[4546]: I0201 08:48:43.978833 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:44 crc kubenswrapper[4546]: I0201 08:48:44.974234 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86c88c587c-hxf72"] Feb 01 08:48:45 crc kubenswrapper[4546]: I0201 08:48:45.485696 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86c88c587c-hxf72" event={"ID":"b6a00d61-5d12-4ee2-a74e-b11ec568dfab","Type":"ContainerStarted","Data":"46699794dcb23cdda5143b4098ba5cd9be9fbd475c2f1c24048e7f811fefbcd0"} Feb 01 08:48:45 crc kubenswrapper[4546]: I0201 08:48:45.486344 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:48:45 crc kubenswrapper[4546]: I0201 08:48:45.486357 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86c88c587c-hxf72" event={"ID":"b6a00d61-5d12-4ee2-a74e-b11ec568dfab","Type":"ContainerStarted","Data":"5b3252bc2ee73a921b84ab29506bec46080d1883f9fe3c4ee090b6cf8df9bb5d"} Feb 01 08:48:45 crc kubenswrapper[4546]: I0201 08:48:45.486369 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86c88c587c-hxf72" event={"ID":"b6a00d61-5d12-4ee2-a74e-b11ec568dfab","Type":"ContainerStarted","Data":"014cc2325cab7639e4900c4cdedba5edcb377077369fa12d02f26955f675e0b0"} Feb 01 08:48:45 crc kubenswrapper[4546]: I0201 08:48:45.502961 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-86c88c587c-hxf72" 
podStartSLOduration=2.502946202 podStartE2EDuration="2.502946202s" podCreationTimestamp="2026-02-01 08:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:48:45.501496227 +0000 UTC m=+7556.152432243" watchObservedRunningTime="2026-02-01 08:48:45.502946202 +0000 UTC m=+7556.153882217" Feb 01 08:48:55 crc kubenswrapper[4546]: I0201 08:48:55.420447 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:48:55 crc kubenswrapper[4546]: I0201 08:48:55.421111 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:48:55 crc kubenswrapper[4546]: I0201 08:48:55.421162 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 08:48:55 crc kubenswrapper[4546]: I0201 08:48:55.422137 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10339ac69fb4b2e979aa6366536108cf1c17d49cc38aab41c61667c2a4fba969"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:48:55 crc kubenswrapper[4546]: I0201 08:48:55.423014 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" 
podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://10339ac69fb4b2e979aa6366536108cf1c17d49cc38aab41c61667c2a4fba969" gracePeriod=600 Feb 01 08:48:55 crc kubenswrapper[4546]: I0201 08:48:55.597652 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="10339ac69fb4b2e979aa6366536108cf1c17d49cc38aab41c61667c2a4fba969" exitCode=0 Feb 01 08:48:55 crc kubenswrapper[4546]: I0201 08:48:55.597943 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"10339ac69fb4b2e979aa6366536108cf1c17d49cc38aab41c61667c2a4fba969"} Feb 01 08:48:55 crc kubenswrapper[4546]: I0201 08:48:55.598736 4546 scope.go:117] "RemoveContainer" containerID="4c5116d536fb524bdd2832af23d996650eb3e769f02e3dc94b80ea87bcdef676" Feb 01 08:48:56 crc kubenswrapper[4546]: I0201 08:48:56.608888 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75"} Feb 01 08:49:13 crc kubenswrapper[4546]: I0201 08:49:13.992921 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-86c88c587c-hxf72" Feb 01 08:49:14 crc kubenswrapper[4546]: I0201 08:49:14.053917 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d9bbd7fcc-ddszx"] Feb 01 08:49:14 crc kubenswrapper[4546]: I0201 08:49:14.058873 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d9bbd7fcc-ddszx" podUID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerName="neutron-api" containerID="cri-o://85b2c2df3860ae2f5c4820cde0f230c9754869bc81c7299dbe288950bc1e0636" gracePeriod=30 Feb 01 
08:49:14 crc kubenswrapper[4546]: I0201 08:49:14.058986 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d9bbd7fcc-ddszx" podUID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerName="neutron-httpd" containerID="cri-o://201e45fb339bad634f368fe4234f202a0e4b7498680b7a8faae312806b4765e2" gracePeriod=30 Feb 01 08:49:14 crc kubenswrapper[4546]: I0201 08:49:14.777657 4546 generic.go:334] "Generic (PLEG): container finished" podID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerID="201e45fb339bad634f368fe4234f202a0e4b7498680b7a8faae312806b4765e2" exitCode=0 Feb 01 08:49:14 crc kubenswrapper[4546]: I0201 08:49:14.777708 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d9bbd7fcc-ddszx" event={"ID":"58ba942c-cc7f-4521-aa6b-8e141c861eb9","Type":"ContainerDied","Data":"201e45fb339bad634f368fe4234f202a0e4b7498680b7a8faae312806b4765e2"} Feb 01 08:49:16 crc kubenswrapper[4546]: I0201 08:49:16.816425 4546 generic.go:334] "Generic (PLEG): container finished" podID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerID="85b2c2df3860ae2f5c4820cde0f230c9754869bc81c7299dbe288950bc1e0636" exitCode=0 Feb 01 08:49:16 crc kubenswrapper[4546]: I0201 08:49:16.816704 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d9bbd7fcc-ddszx" event={"ID":"58ba942c-cc7f-4521-aa6b-8e141c861eb9","Type":"ContainerDied","Data":"85b2c2df3860ae2f5c4820cde0f230c9754869bc81c7299dbe288950bc1e0636"} Feb 01 08:49:16 crc kubenswrapper[4546]: I0201 08:49:16.973766 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.051175 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z4w4\" (UniqueName: \"kubernetes.io/projected/58ba942c-cc7f-4521-aa6b-8e141c861eb9-kube-api-access-5z4w4\") pod \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.051232 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-httpd-config\") pod \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.051327 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-config\") pod \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.051395 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-public-tls-certs\") pod \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.051510 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-ovndb-tls-certs\") pod \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.051538 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-internal-tls-certs\") pod \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.051576 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-combined-ca-bundle\") pod \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\" (UID: \"58ba942c-cc7f-4521-aa6b-8e141c861eb9\") " Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.063123 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ba942c-cc7f-4521-aa6b-8e141c861eb9-kube-api-access-5z4w4" (OuterVolumeSpecName: "kube-api-access-5z4w4") pod "58ba942c-cc7f-4521-aa6b-8e141c861eb9" (UID: "58ba942c-cc7f-4521-aa6b-8e141c861eb9"). InnerVolumeSpecName "kube-api-access-5z4w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.078488 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "58ba942c-cc7f-4521-aa6b-8e141c861eb9" (UID: "58ba942c-cc7f-4521-aa6b-8e141c861eb9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.100480 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "58ba942c-cc7f-4521-aa6b-8e141c861eb9" (UID: "58ba942c-cc7f-4521-aa6b-8e141c861eb9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.112607 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-config" (OuterVolumeSpecName: "config") pod "58ba942c-cc7f-4521-aa6b-8e141c861eb9" (UID: "58ba942c-cc7f-4521-aa6b-8e141c861eb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.114503 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58ba942c-cc7f-4521-aa6b-8e141c861eb9" (UID: "58ba942c-cc7f-4521-aa6b-8e141c861eb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.115388 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58ba942c-cc7f-4521-aa6b-8e141c861eb9" (UID: "58ba942c-cc7f-4521-aa6b-8e141c861eb9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.125381 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "58ba942c-cc7f-4521-aa6b-8e141c861eb9" (UID: "58ba942c-cc7f-4521-aa6b-8e141c861eb9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.153698 4546 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.153725 4546 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.153735 4546 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.153742 4546 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.153750 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.153758 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z4w4\" (UniqueName: \"kubernetes.io/projected/58ba942c-cc7f-4521-aa6b-8e141c861eb9-kube-api-access-5z4w4\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.153766 4546 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58ba942c-cc7f-4521-aa6b-8e141c861eb9-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.827639 4546 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d9bbd7fcc-ddszx" event={"ID":"58ba942c-cc7f-4521-aa6b-8e141c861eb9","Type":"ContainerDied","Data":"3d70bc5b4d2135051dd4ac42b8564487b1705cb3e47ea7937f5fbfd00acc7c8d"} Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.827692 4546 scope.go:117] "RemoveContainer" containerID="201e45fb339bad634f368fe4234f202a0e4b7498680b7a8faae312806b4765e2" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.827698 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d9bbd7fcc-ddszx" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.849341 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d9bbd7fcc-ddszx"] Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.851046 4546 scope.go:117] "RemoveContainer" containerID="85b2c2df3860ae2f5c4820cde0f230c9754869bc81c7299dbe288950bc1e0636" Feb 01 08:49:17 crc kubenswrapper[4546]: I0201 08:49:17.857333 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d9bbd7fcc-ddszx"] Feb 01 08:49:19 crc kubenswrapper[4546]: I0201 08:49:19.668164 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" path="/var/lib/kubelet/pods/58ba942c-cc7f-4521-aa6b-8e141c861eb9/volumes" Feb 01 08:50:25 crc kubenswrapper[4546]: E0201 08:50:25.317088 4546 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.196:49824->192.168.26.196:40843: write tcp 192.168.26.196:49824->192.168.26.196:40843: write: connection reset by peer Feb 01 08:50:55 crc kubenswrapper[4546]: I0201 08:50:55.420752 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:50:55 
crc kubenswrapper[4546]: I0201 08:50:55.421447 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:51:25 crc kubenswrapper[4546]: I0201 08:51:25.421194 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:51:25 crc kubenswrapper[4546]: I0201 08:51:25.421706 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:51:55 crc kubenswrapper[4546]: I0201 08:51:55.421434 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:51:55 crc kubenswrapper[4546]: I0201 08:51:55.422095 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:51:55 crc kubenswrapper[4546]: I0201 08:51:55.422136 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 08:51:55 crc kubenswrapper[4546]: I0201 08:51:55.423026 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:51:55 crc kubenswrapper[4546]: I0201 08:51:55.423076 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" gracePeriod=600 Feb 01 08:51:55 crc kubenswrapper[4546]: E0201 08:51:55.542470 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:51:56 crc kubenswrapper[4546]: I0201 08:51:56.131355 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" exitCode=0 Feb 01 08:51:56 crc kubenswrapper[4546]: I0201 08:51:56.131429 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75"} 
Feb 01 08:51:56 crc kubenswrapper[4546]: I0201 08:51:56.131731 4546 scope.go:117] "RemoveContainer" containerID="10339ac69fb4b2e979aa6366536108cf1c17d49cc38aab41c61667c2a4fba969" Feb 01 08:51:56 crc kubenswrapper[4546]: I0201 08:51:56.132726 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:51:56 crc kubenswrapper[4546]: E0201 08:51:56.133209 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:52:08 crc kubenswrapper[4546]: I0201 08:52:08.655115 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:52:08 crc kubenswrapper[4546]: E0201 08:52:08.655824 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:52:23 crc kubenswrapper[4546]: I0201 08:52:23.655748 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:52:23 crc kubenswrapper[4546]: E0201 08:52:23.656921 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:52:34 crc kubenswrapper[4546]: I0201 08:52:34.655255 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:52:34 crc kubenswrapper[4546]: E0201 08:52:34.656145 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:52:40 crc kubenswrapper[4546]: I0201 08:52:40.943908 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2vxl4"] Feb 01 08:52:40 crc kubenswrapper[4546]: E0201 08:52:40.945220 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerName="neutron-httpd" Feb 01 08:52:40 crc kubenswrapper[4546]: I0201 08:52:40.945237 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerName="neutron-httpd" Feb 01 08:52:40 crc kubenswrapper[4546]: E0201 08:52:40.945279 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerName="neutron-api" Feb 01 08:52:40 crc kubenswrapper[4546]: I0201 08:52:40.945284 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerName="neutron-api" Feb 01 08:52:40 crc kubenswrapper[4546]: I0201 08:52:40.945518 4546 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerName="neutron-api" Feb 01 08:52:40 crc kubenswrapper[4546]: I0201 08:52:40.945542 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ba942c-cc7f-4521-aa6b-8e141c861eb9" containerName="neutron-httpd" Feb 01 08:52:40 crc kubenswrapper[4546]: I0201 08:52:40.951005 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:40 crc kubenswrapper[4546]: I0201 08:52:40.975051 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2vxl4"] Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.027016 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-utilities\") pod \"certified-operators-2vxl4\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.027154 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-catalog-content\") pod \"certified-operators-2vxl4\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.027459 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4kqh\" (UniqueName: \"kubernetes.io/projected/bac204a7-b749-4cdd-86e7-6cefb09cf964-kube-api-access-p4kqh\") pod \"certified-operators-2vxl4\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.129879 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-utilities\") pod \"certified-operators-2vxl4\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.130197 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-catalog-content\") pod \"certified-operators-2vxl4\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.130461 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-utilities\") pod \"certified-operators-2vxl4\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.130573 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4kqh\" (UniqueName: \"kubernetes.io/projected/bac204a7-b749-4cdd-86e7-6cefb09cf964-kube-api-access-p4kqh\") pod \"certified-operators-2vxl4\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.130620 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-catalog-content\") pod \"certified-operators-2vxl4\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.165309 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p4kqh\" (UniqueName: \"kubernetes.io/projected/bac204a7-b749-4cdd-86e7-6cefb09cf964-kube-api-access-p4kqh\") pod \"certified-operators-2vxl4\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.268927 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:41 crc kubenswrapper[4546]: I0201 08:52:41.729958 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2vxl4"] Feb 01 08:52:42 crc kubenswrapper[4546]: I0201 08:52:42.497981 4546 generic.go:334] "Generic (PLEG): container finished" podID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerID="3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78" exitCode=0 Feb 01 08:52:42 crc kubenswrapper[4546]: I0201 08:52:42.498035 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vxl4" event={"ID":"bac204a7-b749-4cdd-86e7-6cefb09cf964","Type":"ContainerDied","Data":"3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78"} Feb 01 08:52:42 crc kubenswrapper[4546]: I0201 08:52:42.498359 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vxl4" event={"ID":"bac204a7-b749-4cdd-86e7-6cefb09cf964","Type":"ContainerStarted","Data":"d07944cc3f3449eafcb45749128878fc2a375ed3fbc3b4e133f4bea8961fe398"} Feb 01 08:52:42 crc kubenswrapper[4546]: I0201 08:52:42.505621 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:52:43 crc kubenswrapper[4546]: I0201 08:52:43.514555 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vxl4" 
event={"ID":"bac204a7-b749-4cdd-86e7-6cefb09cf964","Type":"ContainerStarted","Data":"5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4"} Feb 01 08:52:45 crc kubenswrapper[4546]: I0201 08:52:45.536617 4546 generic.go:334] "Generic (PLEG): container finished" podID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerID="5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4" exitCode=0 Feb 01 08:52:45 crc kubenswrapper[4546]: I0201 08:52:45.536699 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vxl4" event={"ID":"bac204a7-b749-4cdd-86e7-6cefb09cf964","Type":"ContainerDied","Data":"5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4"} Feb 01 08:52:45 crc kubenswrapper[4546]: I0201 08:52:45.654368 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:52:45 crc kubenswrapper[4546]: E0201 08:52:45.654633 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.297095 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6qnrs"] Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.299991 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.313182 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qnrs"] Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.353986 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-utilities\") pod \"community-operators-6qnrs\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.354287 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c8zl\" (UniqueName: \"kubernetes.io/projected/8cdb5e7f-df02-4550-b8f5-4ad538c79929-kube-api-access-7c8zl\") pod \"community-operators-6qnrs\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.354447 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-catalog-content\") pod \"community-operators-6qnrs\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.456172 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-utilities\") pod \"community-operators-6qnrs\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.456315 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7c8zl\" (UniqueName: \"kubernetes.io/projected/8cdb5e7f-df02-4550-b8f5-4ad538c79929-kube-api-access-7c8zl\") pod \"community-operators-6qnrs\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.456384 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-catalog-content\") pod \"community-operators-6qnrs\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.456794 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-utilities\") pod \"community-operators-6qnrs\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.457017 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-catalog-content\") pod \"community-operators-6qnrs\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.479556 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c8zl\" (UniqueName: \"kubernetes.io/projected/8cdb5e7f-df02-4550-b8f5-4ad538c79929-kube-api-access-7c8zl\") pod \"community-operators-6qnrs\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.552210 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-2vxl4" event={"ID":"bac204a7-b749-4cdd-86e7-6cefb09cf964","Type":"ContainerStarted","Data":"1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23"} Feb 01 08:52:46 crc kubenswrapper[4546]: I0201 08:52:46.621982 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:47 crc kubenswrapper[4546]: I0201 08:52:47.117003 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2vxl4" podStartSLOduration=3.4569395 podStartE2EDuration="7.116967489s" podCreationTimestamp="2026-02-01 08:52:40 +0000 UTC" firstStartedPulling="2026-02-01 08:52:42.502479615 +0000 UTC m=+7793.153415631" lastFinishedPulling="2026-02-01 08:52:46.162507614 +0000 UTC m=+7796.813443620" observedRunningTime="2026-02-01 08:52:46.578031744 +0000 UTC m=+7797.228967760" watchObservedRunningTime="2026-02-01 08:52:47.116967489 +0000 UTC m=+7797.767903505" Feb 01 08:52:47 crc kubenswrapper[4546]: I0201 08:52:47.141370 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qnrs"] Feb 01 08:52:47 crc kubenswrapper[4546]: I0201 08:52:47.575811 4546 generic.go:334] "Generic (PLEG): container finished" podID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerID="8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42" exitCode=0 Feb 01 08:52:47 crc kubenswrapper[4546]: I0201 08:52:47.576980 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qnrs" event={"ID":"8cdb5e7f-df02-4550-b8f5-4ad538c79929","Type":"ContainerDied","Data":"8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42"} Feb 01 08:52:47 crc kubenswrapper[4546]: I0201 08:52:47.577087 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qnrs" 
event={"ID":"8cdb5e7f-df02-4550-b8f5-4ad538c79929","Type":"ContainerStarted","Data":"4fa803671d40524c3f85ec9d20467ae7b72240352bebc61d8bd6b7a11efc1a5b"} Feb 01 08:52:48 crc kubenswrapper[4546]: I0201 08:52:48.588719 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qnrs" event={"ID":"8cdb5e7f-df02-4550-b8f5-4ad538c79929","Type":"ContainerStarted","Data":"8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d"} Feb 01 08:52:49 crc kubenswrapper[4546]: I0201 08:52:49.598456 4546 generic.go:334] "Generic (PLEG): container finished" podID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerID="8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d" exitCode=0 Feb 01 08:52:49 crc kubenswrapper[4546]: I0201 08:52:49.598653 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qnrs" event={"ID":"8cdb5e7f-df02-4550-b8f5-4ad538c79929","Type":"ContainerDied","Data":"8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d"} Feb 01 08:52:50 crc kubenswrapper[4546]: I0201 08:52:50.610333 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qnrs" event={"ID":"8cdb5e7f-df02-4550-b8f5-4ad538c79929","Type":"ContainerStarted","Data":"a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e"} Feb 01 08:52:50 crc kubenswrapper[4546]: I0201 08:52:50.654735 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6qnrs" podStartSLOduration=2.155932595 podStartE2EDuration="4.654711107s" podCreationTimestamp="2026-02-01 08:52:46 +0000 UTC" firstStartedPulling="2026-02-01 08:52:47.581753325 +0000 UTC m=+7798.232689340" lastFinishedPulling="2026-02-01 08:52:50.080531835 +0000 UTC m=+7800.731467852" observedRunningTime="2026-02-01 08:52:50.642952197 +0000 UTC m=+7801.293888213" watchObservedRunningTime="2026-02-01 08:52:50.654711107 +0000 UTC 
m=+7801.305647124" Feb 01 08:52:51 crc kubenswrapper[4546]: I0201 08:52:51.269709 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:51 crc kubenswrapper[4546]: I0201 08:52:51.270098 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:52:52 crc kubenswrapper[4546]: I0201 08:52:52.311201 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2vxl4" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerName="registry-server" probeResult="failure" output=< Feb 01 08:52:52 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 08:52:52 crc kubenswrapper[4546]: > Feb 01 08:52:56 crc kubenswrapper[4546]: I0201 08:52:56.623011 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:56 crc kubenswrapper[4546]: I0201 08:52:56.624668 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:56 crc kubenswrapper[4546]: I0201 08:52:56.679285 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:56 crc kubenswrapper[4546]: I0201 08:52:56.718293 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:56 crc kubenswrapper[4546]: I0201 08:52:56.911801 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6qnrs"] Feb 01 08:52:57 crc kubenswrapper[4546]: I0201 08:52:57.655314 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:52:57 crc kubenswrapper[4546]: E0201 08:52:57.655590 4546 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:52:58 crc kubenswrapper[4546]: I0201 08:52:58.692738 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6qnrs" podUID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerName="registry-server" containerID="cri-o://a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e" gracePeriod=2 Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.203512 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.286502 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-catalog-content\") pod \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.286613 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-utilities\") pod \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.286771 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c8zl\" (UniqueName: \"kubernetes.io/projected/8cdb5e7f-df02-4550-b8f5-4ad538c79929-kube-api-access-7c8zl\") pod 
\"8cdb5e7f-df02-4550-b8f5-4ad538c79929\" (UID: \"8cdb5e7f-df02-4550-b8f5-4ad538c79929\") " Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.287408 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-utilities" (OuterVolumeSpecName: "utilities") pod "8cdb5e7f-df02-4550-b8f5-4ad538c79929" (UID: "8cdb5e7f-df02-4550-b8f5-4ad538c79929"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.287901 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.300050 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cdb5e7f-df02-4550-b8f5-4ad538c79929-kube-api-access-7c8zl" (OuterVolumeSpecName: "kube-api-access-7c8zl") pod "8cdb5e7f-df02-4550-b8f5-4ad538c79929" (UID: "8cdb5e7f-df02-4550-b8f5-4ad538c79929"). InnerVolumeSpecName "kube-api-access-7c8zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.334896 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cdb5e7f-df02-4550-b8f5-4ad538c79929" (UID: "8cdb5e7f-df02-4550-b8f5-4ad538c79929"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.390369 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e7f-df02-4550-b8f5-4ad538c79929-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.390399 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c8zl\" (UniqueName: \"kubernetes.io/projected/8cdb5e7f-df02-4550-b8f5-4ad538c79929-kube-api-access-7c8zl\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.702301 4546 generic.go:334] "Generic (PLEG): container finished" podID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerID="a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e" exitCode=0 Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.702422 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qnrs" event={"ID":"8cdb5e7f-df02-4550-b8f5-4ad538c79929","Type":"ContainerDied","Data":"a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e"} Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.702547 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qnrs" event={"ID":"8cdb5e7f-df02-4550-b8f5-4ad538c79929","Type":"ContainerDied","Data":"4fa803671d40524c3f85ec9d20467ae7b72240352bebc61d8bd6b7a11efc1a5b"} Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.702569 4546 scope.go:117] "RemoveContainer" containerID="a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.702499 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6qnrs" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.722263 4546 scope.go:117] "RemoveContainer" containerID="8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.724035 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6qnrs"] Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.732766 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6qnrs"] Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.756251 4546 scope.go:117] "RemoveContainer" containerID="8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.827074 4546 scope.go:117] "RemoveContainer" containerID="a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e" Feb 01 08:52:59 crc kubenswrapper[4546]: E0201 08:52:59.829952 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e\": container with ID starting with a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e not found: ID does not exist" containerID="a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.829999 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e"} err="failed to get container status \"a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e\": rpc error: code = NotFound desc = could not find container \"a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e\": container with ID starting with a3c32d6189b8df74766ec0ed226733398524880ba0f906bdc0123a7f4dbe818e not 
found: ID does not exist" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.830028 4546 scope.go:117] "RemoveContainer" containerID="8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d" Feb 01 08:52:59 crc kubenswrapper[4546]: E0201 08:52:59.830435 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d\": container with ID starting with 8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d not found: ID does not exist" containerID="8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.830541 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d"} err="failed to get container status \"8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d\": rpc error: code = NotFound desc = could not find container \"8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d\": container with ID starting with 8cee01c96f10829bdcfa050b602671a28c7436381405c153668c61bf096af74d not found: ID does not exist" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.831253 4546 scope.go:117] "RemoveContainer" containerID="8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42" Feb 01 08:52:59 crc kubenswrapper[4546]: E0201 08:52:59.832011 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42\": container with ID starting with 8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42 not found: ID does not exist" containerID="8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42" Feb 01 08:52:59 crc kubenswrapper[4546]: I0201 08:52:59.832050 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42"} err="failed to get container status \"8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42\": rpc error: code = NotFound desc = could not find container \"8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42\": container with ID starting with 8eea5b31ee5001bfcf8bb59ede5891010882ee37703fe2793d9411a116d80a42 not found: ID does not exist" Feb 01 08:53:01 crc kubenswrapper[4546]: I0201 08:53:01.313585 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:53:01 crc kubenswrapper[4546]: I0201 08:53:01.357639 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:53:01 crc kubenswrapper[4546]: I0201 08:53:01.670490 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" path="/var/lib/kubelet/pods/8cdb5e7f-df02-4550-b8f5-4ad538c79929/volumes" Feb 01 08:53:02 crc kubenswrapper[4546]: I0201 08:53:02.312086 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2vxl4"] Feb 01 08:53:02 crc kubenswrapper[4546]: I0201 08:53:02.726600 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2vxl4" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerName="registry-server" containerID="cri-o://1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23" gracePeriod=2 Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.203456 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.336372 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-utilities\") pod \"bac204a7-b749-4cdd-86e7-6cefb09cf964\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.336418 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-catalog-content\") pod \"bac204a7-b749-4cdd-86e7-6cefb09cf964\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.336568 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4kqh\" (UniqueName: \"kubernetes.io/projected/bac204a7-b749-4cdd-86e7-6cefb09cf964-kube-api-access-p4kqh\") pod \"bac204a7-b749-4cdd-86e7-6cefb09cf964\" (UID: \"bac204a7-b749-4cdd-86e7-6cefb09cf964\") " Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.336926 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-utilities" (OuterVolumeSpecName: "utilities") pod "bac204a7-b749-4cdd-86e7-6cefb09cf964" (UID: "bac204a7-b749-4cdd-86e7-6cefb09cf964"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.337362 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.353023 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bac204a7-b749-4cdd-86e7-6cefb09cf964-kube-api-access-p4kqh" (OuterVolumeSpecName: "kube-api-access-p4kqh") pod "bac204a7-b749-4cdd-86e7-6cefb09cf964" (UID: "bac204a7-b749-4cdd-86e7-6cefb09cf964"). InnerVolumeSpecName "kube-api-access-p4kqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.380058 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bac204a7-b749-4cdd-86e7-6cefb09cf964" (UID: "bac204a7-b749-4cdd-86e7-6cefb09cf964"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.439262 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4kqh\" (UniqueName: \"kubernetes.io/projected/bac204a7-b749-4cdd-86e7-6cefb09cf964-kube-api-access-p4kqh\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.439292 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bac204a7-b749-4cdd-86e7-6cefb09cf964-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.739063 4546 generic.go:334] "Generic (PLEG): container finished" podID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerID="1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23" exitCode=0 Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.739123 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vxl4" event={"ID":"bac204a7-b749-4cdd-86e7-6cefb09cf964","Type":"ContainerDied","Data":"1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23"} Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.739161 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2vxl4" event={"ID":"bac204a7-b749-4cdd-86e7-6cefb09cf964","Type":"ContainerDied","Data":"d07944cc3f3449eafcb45749128878fc2a375ed3fbc3b4e133f4bea8961fe398"} Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.739177 4546 scope.go:117] "RemoveContainer" containerID="1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.739176 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2vxl4" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.771443 4546 scope.go:117] "RemoveContainer" containerID="5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.773898 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2vxl4"] Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.789072 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2vxl4"] Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.791599 4546 scope.go:117] "RemoveContainer" containerID="3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.829383 4546 scope.go:117] "RemoveContainer" containerID="1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23" Feb 01 08:53:03 crc kubenswrapper[4546]: E0201 08:53:03.829714 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23\": container with ID starting with 1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23 not found: ID does not exist" containerID="1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.829747 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23"} err="failed to get container status \"1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23\": rpc error: code = NotFound desc = could not find container \"1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23\": container with ID starting with 1bdb5be44eb1059d296289170d083b358e5a0b4151e5ed77110776a2d5b85c23 not 
found: ID does not exist" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.829767 4546 scope.go:117] "RemoveContainer" containerID="5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4" Feb 01 08:53:03 crc kubenswrapper[4546]: E0201 08:53:03.830085 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4\": container with ID starting with 5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4 not found: ID does not exist" containerID="5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.830105 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4"} err="failed to get container status \"5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4\": rpc error: code = NotFound desc = could not find container \"5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4\": container with ID starting with 5ffdc84a5325831d3a645f8397c1d7be8799d639e26293a57f5b6583183265d4 not found: ID does not exist" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.830116 4546 scope.go:117] "RemoveContainer" containerID="3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78" Feb 01 08:53:03 crc kubenswrapper[4546]: E0201 08:53:03.830331 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78\": container with ID starting with 3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78 not found: ID does not exist" containerID="3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78" Feb 01 08:53:03 crc kubenswrapper[4546]: I0201 08:53:03.830351 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78"} err="failed to get container status \"3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78\": rpc error: code = NotFound desc = could not find container \"3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78\": container with ID starting with 3e68e06af0d7213416c360325a2589e7a5d8ec929955591c5b1f0e4020b8cd78 not found: ID does not exist" Feb 01 08:53:05 crc kubenswrapper[4546]: I0201 08:53:05.664464 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" path="/var/lib/kubelet/pods/bac204a7-b749-4cdd-86e7-6cefb09cf964/volumes" Feb 01 08:53:12 crc kubenswrapper[4546]: I0201 08:53:12.655274 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:53:12 crc kubenswrapper[4546]: E0201 08:53:12.655967 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:53:27 crc kubenswrapper[4546]: I0201 08:53:27.655702 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:53:27 crc kubenswrapper[4546]: E0201 08:53:27.656639 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:53:42 crc kubenswrapper[4546]: I0201 08:53:42.655752 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:53:42 crc kubenswrapper[4546]: E0201 08:53:42.656554 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:53:57 crc kubenswrapper[4546]: I0201 08:53:57.655124 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:53:57 crc kubenswrapper[4546]: E0201 08:53:57.655947 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:54:10 crc kubenswrapper[4546]: I0201 08:54:10.654677 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:54:10 crc kubenswrapper[4546]: E0201 08:54:10.655541 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:54:21 crc kubenswrapper[4546]: I0201 08:54:21.655133 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:54:21 crc kubenswrapper[4546]: E0201 08:54:21.656134 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:54:32 crc kubenswrapper[4546]: I0201 08:54:32.655764 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:54:32 crc kubenswrapper[4546]: E0201 08:54:32.656810 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:54:43 crc kubenswrapper[4546]: I0201 08:54:43.655115 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:54:43 crc kubenswrapper[4546]: E0201 08:54:43.655952 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:54:58 crc kubenswrapper[4546]: I0201 08:54:58.655156 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:54:58 crc kubenswrapper[4546]: E0201 08:54:58.656271 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:55:09 crc kubenswrapper[4546]: I0201 08:55:09.655125 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:55:09 crc kubenswrapper[4546]: E0201 08:55:09.655932 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:55:21 crc kubenswrapper[4546]: I0201 08:55:21.657024 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:55:21 crc kubenswrapper[4546]: E0201 08:55:21.658197 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:55:33 crc kubenswrapper[4546]: I0201 08:55:33.655239 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:55:33 crc kubenswrapper[4546]: E0201 08:55:33.656464 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:55:48 crc kubenswrapper[4546]: I0201 08:55:48.654638 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:55:48 crc kubenswrapper[4546]: E0201 08:55:48.655438 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.370009 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ng89n"] Feb 01 08:55:51 crc kubenswrapper[4546]: E0201 08:55:51.378904 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" 
containerName="registry-server" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.378942 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerName="registry-server" Feb 01 08:55:51 crc kubenswrapper[4546]: E0201 08:55:51.378978 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerName="extract-content" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.378986 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerName="extract-content" Feb 01 08:55:51 crc kubenswrapper[4546]: E0201 08:55:51.378997 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerName="extract-content" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.379003 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerName="extract-content" Feb 01 08:55:51 crc kubenswrapper[4546]: E0201 08:55:51.379019 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerName="registry-server" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.379025 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerName="registry-server" Feb 01 08:55:51 crc kubenswrapper[4546]: E0201 08:55:51.379063 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerName="extract-utilities" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.379069 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerName="extract-utilities" Feb 01 08:55:51 crc kubenswrapper[4546]: E0201 08:55:51.379090 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" 
containerName="extract-utilities" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.379098 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerName="extract-utilities" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.381358 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cdb5e7f-df02-4550-b8f5-4ad538c79929" containerName="registry-server" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.381405 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="bac204a7-b749-4cdd-86e7-6cefb09cf964" containerName="registry-server" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.396830 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.425813 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c525\" (UniqueName: \"kubernetes.io/projected/e8023917-dd89-4b1e-af81-cce55e771710-kube-api-access-2c525\") pod \"redhat-operators-ng89n\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.426014 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-utilities\") pod \"redhat-operators-ng89n\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.426178 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-catalog-content\") pod \"redhat-operators-ng89n\" (UID: 
\"e8023917-dd89-4b1e-af81-cce55e771710\") " pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.431970 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng89n"] Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.528603 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c525\" (UniqueName: \"kubernetes.io/projected/e8023917-dd89-4b1e-af81-cce55e771710-kube-api-access-2c525\") pod \"redhat-operators-ng89n\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.528939 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-utilities\") pod \"redhat-operators-ng89n\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.529124 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-catalog-content\") pod \"redhat-operators-ng89n\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.529318 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-utilities\") pod \"redhat-operators-ng89n\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.529514 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-catalog-content\") pod \"redhat-operators-ng89n\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.548273 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c525\" (UniqueName: \"kubernetes.io/projected/e8023917-dd89-4b1e-af81-cce55e771710-kube-api-access-2c525\") pod \"redhat-operators-ng89n\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:51 crc kubenswrapper[4546]: I0201 08:55:51.734296 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:55:52 crc kubenswrapper[4546]: I0201 08:55:52.713102 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ng89n"] Feb 01 08:55:52 crc kubenswrapper[4546]: W0201 08:55:52.726258 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8023917_dd89_4b1e_af81_cce55e771710.slice/crio-85ab40944363deadab3afae0f1a04f1850665630bf94608e0e86e2618666e03a WatchSource:0}: Error finding container 85ab40944363deadab3afae0f1a04f1850665630bf94608e0e86e2618666e03a: Status 404 returned error can't find the container with id 85ab40944363deadab3afae0f1a04f1850665630bf94608e0e86e2618666e03a Feb 01 08:55:53 crc kubenswrapper[4546]: I0201 08:55:53.206892 4546 generic.go:334] "Generic (PLEG): container finished" podID="e8023917-dd89-4b1e-af81-cce55e771710" containerID="14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e" exitCode=0 Feb 01 08:55:53 crc kubenswrapper[4546]: I0201 08:55:53.206946 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng89n" 
event={"ID":"e8023917-dd89-4b1e-af81-cce55e771710","Type":"ContainerDied","Data":"14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e"} Feb 01 08:55:53 crc kubenswrapper[4546]: I0201 08:55:53.206978 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng89n" event={"ID":"e8023917-dd89-4b1e-af81-cce55e771710","Type":"ContainerStarted","Data":"85ab40944363deadab3afae0f1a04f1850665630bf94608e0e86e2618666e03a"} Feb 01 08:55:54 crc kubenswrapper[4546]: I0201 08:55:54.217200 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng89n" event={"ID":"e8023917-dd89-4b1e-af81-cce55e771710","Type":"ContainerStarted","Data":"2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd"} Feb 01 08:55:57 crc kubenswrapper[4546]: I0201 08:55:57.245246 4546 generic.go:334] "Generic (PLEG): container finished" podID="e8023917-dd89-4b1e-af81-cce55e771710" containerID="2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd" exitCode=0 Feb 01 08:55:57 crc kubenswrapper[4546]: I0201 08:55:57.245338 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng89n" event={"ID":"e8023917-dd89-4b1e-af81-cce55e771710","Type":"ContainerDied","Data":"2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd"} Feb 01 08:55:58 crc kubenswrapper[4546]: I0201 08:55:58.259690 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng89n" event={"ID":"e8023917-dd89-4b1e-af81-cce55e771710","Type":"ContainerStarted","Data":"c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97"} Feb 01 08:55:58 crc kubenswrapper[4546]: I0201 08:55:58.285133 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ng89n" podStartSLOduration=2.751803262 podStartE2EDuration="7.284658117s" podCreationTimestamp="2026-02-01 08:55:51 +0000 UTC" 
firstStartedPulling="2026-02-01 08:55:53.209467988 +0000 UTC m=+7983.860404004" lastFinishedPulling="2026-02-01 08:55:57.742322843 +0000 UTC m=+7988.393258859" observedRunningTime="2026-02-01 08:55:58.281499673 +0000 UTC m=+7988.932435688" watchObservedRunningTime="2026-02-01 08:55:58.284658117 +0000 UTC m=+7988.935594123" Feb 01 08:56:01 crc kubenswrapper[4546]: I0201 08:56:01.734577 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:56:01 crc kubenswrapper[4546]: I0201 08:56:01.735454 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:56:02 crc kubenswrapper[4546]: I0201 08:56:02.655676 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:56:02 crc kubenswrapper[4546]: E0201 08:56:02.656333 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:56:02 crc kubenswrapper[4546]: I0201 08:56:02.773677 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ng89n" podUID="e8023917-dd89-4b1e-af81-cce55e771710" containerName="registry-server" probeResult="failure" output=< Feb 01 08:56:02 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 08:56:02 crc kubenswrapper[4546]: > Feb 01 08:56:11 crc kubenswrapper[4546]: I0201 08:56:11.774443 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:56:11 crc 
kubenswrapper[4546]: I0201 08:56:11.816937 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:56:12 crc kubenswrapper[4546]: I0201 08:56:12.008561 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng89n"] Feb 01 08:56:13 crc kubenswrapper[4546]: I0201 08:56:13.415267 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ng89n" podUID="e8023917-dd89-4b1e-af81-cce55e771710" containerName="registry-server" containerID="cri-o://c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97" gracePeriod=2 Feb 01 08:56:13 crc kubenswrapper[4546]: I0201 08:56:13.923326 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:56:13 crc kubenswrapper[4546]: I0201 08:56:13.991165 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c525\" (UniqueName: \"kubernetes.io/projected/e8023917-dd89-4b1e-af81-cce55e771710-kube-api-access-2c525\") pod \"e8023917-dd89-4b1e-af81-cce55e771710\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " Feb 01 08:56:13 crc kubenswrapper[4546]: I0201 08:56:13.991395 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-utilities\") pod \"e8023917-dd89-4b1e-af81-cce55e771710\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " Feb 01 08:56:13 crc kubenswrapper[4546]: I0201 08:56:13.991493 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-catalog-content\") pod \"e8023917-dd89-4b1e-af81-cce55e771710\" (UID: \"e8023917-dd89-4b1e-af81-cce55e771710\") " Feb 01 08:56:13 crc 
kubenswrapper[4546]: I0201 08:56:13.993122 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-utilities" (OuterVolumeSpecName: "utilities") pod "e8023917-dd89-4b1e-af81-cce55e771710" (UID: "e8023917-dd89-4b1e-af81-cce55e771710"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.013097 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8023917-dd89-4b1e-af81-cce55e771710-kube-api-access-2c525" (OuterVolumeSpecName: "kube-api-access-2c525") pod "e8023917-dd89-4b1e-af81-cce55e771710" (UID: "e8023917-dd89-4b1e-af81-cce55e771710"). InnerVolumeSpecName "kube-api-access-2c525". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.095080 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c525\" (UniqueName: \"kubernetes.io/projected/e8023917-dd89-4b1e-af81-cce55e771710-kube-api-access-2c525\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.095111 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.108155 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8023917-dd89-4b1e-af81-cce55e771710" (UID: "e8023917-dd89-4b1e-af81-cce55e771710"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.197451 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8023917-dd89-4b1e-af81-cce55e771710-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.425111 4546 generic.go:334] "Generic (PLEG): container finished" podID="e8023917-dd89-4b1e-af81-cce55e771710" containerID="c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97" exitCode=0 Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.425168 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ng89n" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.425188 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng89n" event={"ID":"e8023917-dd89-4b1e-af81-cce55e771710","Type":"ContainerDied","Data":"c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97"} Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.425495 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ng89n" event={"ID":"e8023917-dd89-4b1e-af81-cce55e771710","Type":"ContainerDied","Data":"85ab40944363deadab3afae0f1a04f1850665630bf94608e0e86e2618666e03a"} Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.425515 4546 scope.go:117] "RemoveContainer" containerID="c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.457422 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ng89n"] Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.459491 4546 scope.go:117] "RemoveContainer" containerID="2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 
08:56:14.465968 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ng89n"] Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.481049 4546 scope.go:117] "RemoveContainer" containerID="14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.515553 4546 scope.go:117] "RemoveContainer" containerID="c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97" Feb 01 08:56:14 crc kubenswrapper[4546]: E0201 08:56:14.517117 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97\": container with ID starting with c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97 not found: ID does not exist" containerID="c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.517747 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97"} err="failed to get container status \"c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97\": rpc error: code = NotFound desc = could not find container \"c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97\": container with ID starting with c3ab9e46d596cc7a7533378b989e8791b72b65d1233ced5878a5e19025de4a97 not found: ID does not exist" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.517790 4546 scope.go:117] "RemoveContainer" containerID="2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd" Feb 01 08:56:14 crc kubenswrapper[4546]: E0201 08:56:14.518145 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd\": container with ID 
starting with 2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd not found: ID does not exist" containerID="2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.518171 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd"} err="failed to get container status \"2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd\": rpc error: code = NotFound desc = could not find container \"2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd\": container with ID starting with 2be4dcc85c24534153ce139005a8565fb129da66c0f59117e3a0b3c8880388cd not found: ID does not exist" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.518211 4546 scope.go:117] "RemoveContainer" containerID="14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e" Feb 01 08:56:14 crc kubenswrapper[4546]: E0201 08:56:14.518724 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e\": container with ID starting with 14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e not found: ID does not exist" containerID="14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e" Feb 01 08:56:14 crc kubenswrapper[4546]: I0201 08:56:14.518752 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e"} err="failed to get container status \"14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e\": rpc error: code = NotFound desc = could not find container \"14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e\": container with ID starting with 14958ba22abd45c50214dffadda873d2ac648764a51608048c314c6ea935951e not found: 
ID does not exist" Feb 01 08:56:15 crc kubenswrapper[4546]: I0201 08:56:15.655096 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:56:15 crc kubenswrapper[4546]: E0201 08:56:15.655513 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:56:15 crc kubenswrapper[4546]: I0201 08:56:15.669204 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8023917-dd89-4b1e-af81-cce55e771710" path="/var/lib/kubelet/pods/e8023917-dd89-4b1e-af81-cce55e771710/volumes" Feb 01 08:56:30 crc kubenswrapper[4546]: I0201 08:56:30.654509 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:56:30 crc kubenswrapper[4546]: E0201 08:56:30.655497 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:56:41 crc kubenswrapper[4546]: I0201 08:56:41.657072 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:56:41 crc kubenswrapper[4546]: E0201 08:56:41.657941 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:56:53 crc kubenswrapper[4546]: I0201 08:56:53.655985 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:56:53 crc kubenswrapper[4546]: E0201 08:56:53.657345 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 08:57:06 crc kubenswrapper[4546]: I0201 08:57:06.656314 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 08:57:06 crc kubenswrapper[4546]: I0201 08:57:06.889580 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"5b5a496740023438907fdebf353d928489e597bb30c2005e4d201ba326abedf7"} Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.221763 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4cxwv"] Feb 01 08:57:55 crc kubenswrapper[4546]: E0201 08:57:55.227117 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8023917-dd89-4b1e-af81-cce55e771710" containerName="extract-utilities" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.227152 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e8023917-dd89-4b1e-af81-cce55e771710" containerName="extract-utilities" Feb 01 08:57:55 crc kubenswrapper[4546]: E0201 08:57:55.227179 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8023917-dd89-4b1e-af81-cce55e771710" containerName="extract-content" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.227188 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8023917-dd89-4b1e-af81-cce55e771710" containerName="extract-content" Feb 01 08:57:55 crc kubenswrapper[4546]: E0201 08:57:55.227230 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8023917-dd89-4b1e-af81-cce55e771710" containerName="registry-server" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.227238 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8023917-dd89-4b1e-af81-cce55e771710" containerName="registry-server" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.227771 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8023917-dd89-4b1e-af81-cce55e771710" containerName="registry-server" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.230769 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.238627 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cxwv"] Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.406794 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf47p\" (UniqueName: \"kubernetes.io/projected/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-kube-api-access-hf47p\") pod \"redhat-marketplace-4cxwv\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.406906 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-utilities\") pod \"redhat-marketplace-4cxwv\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.407080 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-catalog-content\") pod \"redhat-marketplace-4cxwv\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.509270 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf47p\" (UniqueName: \"kubernetes.io/projected/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-kube-api-access-hf47p\") pod \"redhat-marketplace-4cxwv\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.509350 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-utilities\") pod \"redhat-marketplace-4cxwv\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.509434 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-catalog-content\") pod \"redhat-marketplace-4cxwv\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.510060 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-catalog-content\") pod \"redhat-marketplace-4cxwv\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.510313 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-utilities\") pod \"redhat-marketplace-4cxwv\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.528778 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf47p\" (UniqueName: \"kubernetes.io/projected/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-kube-api-access-hf47p\") pod \"redhat-marketplace-4cxwv\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:55 crc kubenswrapper[4546]: I0201 08:57:55.551322 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:57:56 crc kubenswrapper[4546]: I0201 08:57:56.046737 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cxwv"] Feb 01 08:57:56 crc kubenswrapper[4546]: I0201 08:57:56.369419 4546 generic.go:334] "Generic (PLEG): container finished" podID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerID="0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e" exitCode=0 Feb 01 08:57:56 crc kubenswrapper[4546]: I0201 08:57:56.369474 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cxwv" event={"ID":"dcc9b4ff-df6f-4d06-b70b-e1d05c671903","Type":"ContainerDied","Data":"0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e"} Feb 01 08:57:56 crc kubenswrapper[4546]: I0201 08:57:56.369501 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cxwv" event={"ID":"dcc9b4ff-df6f-4d06-b70b-e1d05c671903","Type":"ContainerStarted","Data":"d275279cdc250fd75d538d6c3431c68f793924a4e6519dac6cff24f9072d39fe"} Feb 01 08:57:56 crc kubenswrapper[4546]: I0201 08:57:56.371618 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:57:57 crc kubenswrapper[4546]: I0201 08:57:57.379620 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cxwv" event={"ID":"dcc9b4ff-df6f-4d06-b70b-e1d05c671903","Type":"ContainerStarted","Data":"fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e"} Feb 01 08:57:58 crc kubenswrapper[4546]: I0201 08:57:58.389742 4546 generic.go:334] "Generic (PLEG): container finished" podID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerID="fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e" exitCode=0 Feb 01 08:57:58 crc kubenswrapper[4546]: I0201 08:57:58.389810 4546 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-4cxwv" event={"ID":"dcc9b4ff-df6f-4d06-b70b-e1d05c671903","Type":"ContainerDied","Data":"fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e"} Feb 01 08:57:59 crc kubenswrapper[4546]: I0201 08:57:59.403139 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cxwv" event={"ID":"dcc9b4ff-df6f-4d06-b70b-e1d05c671903","Type":"ContainerStarted","Data":"cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325"} Feb 01 08:57:59 crc kubenswrapper[4546]: I0201 08:57:59.425351 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4cxwv" podStartSLOduration=1.913793584 podStartE2EDuration="4.425330542s" podCreationTimestamp="2026-02-01 08:57:55 +0000 UTC" firstStartedPulling="2026-02-01 08:57:56.370794467 +0000 UTC m=+8107.021730474" lastFinishedPulling="2026-02-01 08:57:58.882331425 +0000 UTC m=+8109.533267432" observedRunningTime="2026-02-01 08:57:59.418650027 +0000 UTC m=+8110.069586043" watchObservedRunningTime="2026-02-01 08:57:59.425330542 +0000 UTC m=+8110.076266558" Feb 01 08:58:05 crc kubenswrapper[4546]: I0201 08:58:05.551552 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:58:05 crc kubenswrapper[4546]: I0201 08:58:05.553297 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:58:05 crc kubenswrapper[4546]: I0201 08:58:05.597331 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:58:06 crc kubenswrapper[4546]: I0201 08:58:06.497612 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:58:06 crc kubenswrapper[4546]: I0201 08:58:06.552406 4546 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cxwv"] Feb 01 08:58:08 crc kubenswrapper[4546]: I0201 08:58:08.477405 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4cxwv" podUID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerName="registry-server" containerID="cri-o://cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325" gracePeriod=2 Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.009082 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.123066 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-catalog-content\") pod \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.123956 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf47p\" (UniqueName: \"kubernetes.io/projected/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-kube-api-access-hf47p\") pod \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.124223 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-utilities\") pod \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\" (UID: \"dcc9b4ff-df6f-4d06-b70b-e1d05c671903\") " Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.124770 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-utilities" (OuterVolumeSpecName: "utilities") pod 
"dcc9b4ff-df6f-4d06-b70b-e1d05c671903" (UID: "dcc9b4ff-df6f-4d06-b70b-e1d05c671903"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.125364 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.132105 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-kube-api-access-hf47p" (OuterVolumeSpecName: "kube-api-access-hf47p") pod "dcc9b4ff-df6f-4d06-b70b-e1d05c671903" (UID: "dcc9b4ff-df6f-4d06-b70b-e1d05c671903"). InnerVolumeSpecName "kube-api-access-hf47p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.143294 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcc9b4ff-df6f-4d06-b70b-e1d05c671903" (UID: "dcc9b4ff-df6f-4d06-b70b-e1d05c671903"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.227765 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.227796 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf47p\" (UniqueName: \"kubernetes.io/projected/dcc9b4ff-df6f-4d06-b70b-e1d05c671903-kube-api-access-hf47p\") on node \"crc\" DevicePath \"\"" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.495956 4546 generic.go:334] "Generic (PLEG): container finished" podID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerID="cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325" exitCode=0 Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.496010 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cxwv" event={"ID":"dcc9b4ff-df6f-4d06-b70b-e1d05c671903","Type":"ContainerDied","Data":"cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325"} Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.496041 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cxwv" event={"ID":"dcc9b4ff-df6f-4d06-b70b-e1d05c671903","Type":"ContainerDied","Data":"d275279cdc250fd75d538d6c3431c68f793924a4e6519dac6cff24f9072d39fe"} Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.496060 4546 scope.go:117] "RemoveContainer" containerID="cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.496192 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cxwv" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.524026 4546 scope.go:117] "RemoveContainer" containerID="fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.534181 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cxwv"] Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.543514 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cxwv"] Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.548264 4546 scope.go:117] "RemoveContainer" containerID="0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.588577 4546 scope.go:117] "RemoveContainer" containerID="cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325" Feb 01 08:58:09 crc kubenswrapper[4546]: E0201 08:58:09.588941 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325\": container with ID starting with cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325 not found: ID does not exist" containerID="cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.588977 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325"} err="failed to get container status \"cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325\": rpc error: code = NotFound desc = could not find container \"cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325\": container with ID starting with cea3d453c4a09cc16f4608a87665b3a09476b9f786db893d02c71eb5b34ca325 not found: 
ID does not exist" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.589003 4546 scope.go:117] "RemoveContainer" containerID="fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e" Feb 01 08:58:09 crc kubenswrapper[4546]: E0201 08:58:09.589304 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e\": container with ID starting with fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e not found: ID does not exist" containerID="fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.589337 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e"} err="failed to get container status \"fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e\": rpc error: code = NotFound desc = could not find container \"fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e\": container with ID starting with fa6a2fb7bea4c6db21d378c4d4a2b211df57186e59c007068fde7e2a3e0e2c3e not found: ID does not exist" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.589363 4546 scope.go:117] "RemoveContainer" containerID="0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e" Feb 01 08:58:09 crc kubenswrapper[4546]: E0201 08:58:09.589600 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e\": container with ID starting with 0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e not found: ID does not exist" containerID="0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.589639 4546 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e"} err="failed to get container status \"0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e\": rpc error: code = NotFound desc = could not find container \"0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e\": container with ID starting with 0d90b2b5fce908fb15499ee0937ab258a27281f4489b74216fc2a95b74b7200e not found: ID does not exist" Feb 01 08:58:09 crc kubenswrapper[4546]: I0201 08:58:09.665146 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" path="/var/lib/kubelet/pods/dcc9b4ff-df6f-4d06-b70b-e1d05c671903/volumes" Feb 01 08:59:25 crc kubenswrapper[4546]: I0201 08:59:25.420872 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:59:25 crc kubenswrapper[4546]: I0201 08:59:25.421843 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:59:55 crc kubenswrapper[4546]: I0201 08:59:55.421133 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:59:55 crc kubenswrapper[4546]: I0201 08:59:55.422977 4546 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.221634 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4"] Feb 01 09:00:00 crc kubenswrapper[4546]: E0201 09:00:00.222727 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerName="extract-utilities" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.222744 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerName="extract-utilities" Feb 01 09:00:00 crc kubenswrapper[4546]: E0201 09:00:00.222771 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerName="registry-server" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.222777 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerName="registry-server" Feb 01 09:00:00 crc kubenswrapper[4546]: E0201 09:00:00.222790 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerName="extract-content" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.222795 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerName="extract-content" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.223027 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcc9b4ff-df6f-4d06-b70b-e1d05c671903" containerName="registry-server" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.224033 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.234331 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.234336 4546 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.239525 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4"] Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.287640 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab5faba6-ac0f-409b-a113-b4293ace60e1-secret-volume\") pod \"collect-profiles-29498940-t8jt4\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.288310 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrv9\" (UniqueName: \"kubernetes.io/projected/ab5faba6-ac0f-409b-a113-b4293ace60e1-kube-api-access-mxrv9\") pod \"collect-profiles-29498940-t8jt4\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.288494 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab5faba6-ac0f-409b-a113-b4293ace60e1-config-volume\") pod \"collect-profiles-29498940-t8jt4\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.390877 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab5faba6-ac0f-409b-a113-b4293ace60e1-secret-volume\") pod \"collect-profiles-29498940-t8jt4\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.391010 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxrv9\" (UniqueName: \"kubernetes.io/projected/ab5faba6-ac0f-409b-a113-b4293ace60e1-kube-api-access-mxrv9\") pod \"collect-profiles-29498940-t8jt4\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.391075 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab5faba6-ac0f-409b-a113-b4293ace60e1-config-volume\") pod \"collect-profiles-29498940-t8jt4\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.391826 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab5faba6-ac0f-409b-a113-b4293ace60e1-config-volume\") pod \"collect-profiles-29498940-t8jt4\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.397573 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ab5faba6-ac0f-409b-a113-b4293ace60e1-secret-volume\") pod \"collect-profiles-29498940-t8jt4\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.408818 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrv9\" (UniqueName: \"kubernetes.io/projected/ab5faba6-ac0f-409b-a113-b4293ace60e1-kube-api-access-mxrv9\") pod \"collect-profiles-29498940-t8jt4\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.545031 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:00 crc kubenswrapper[4546]: I0201 09:00:00.987915 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4"] Feb 01 09:00:01 crc kubenswrapper[4546]: I0201 09:00:01.649795 4546 generic.go:334] "Generic (PLEG): container finished" podID="ab5faba6-ac0f-409b-a113-b4293ace60e1" containerID="a85b57b60d00b8614e13279c2406aadaa0b4a5f6fcaad7332ab60a12c44c65fa" exitCode=0 Feb 01 09:00:01 crc kubenswrapper[4546]: I0201 09:00:01.650490 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" event={"ID":"ab5faba6-ac0f-409b-a113-b4293ace60e1","Type":"ContainerDied","Data":"a85b57b60d00b8614e13279c2406aadaa0b4a5f6fcaad7332ab60a12c44c65fa"} Feb 01 09:00:01 crc kubenswrapper[4546]: I0201 09:00:01.650526 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" 
event={"ID":"ab5faba6-ac0f-409b-a113-b4293ace60e1","Type":"ContainerStarted","Data":"351e42fa4a9560efdbcae630ea6b13cbb6e1c88a9d6ca71acb6c0fd7b54fb275"} Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.092221 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.173025 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab5faba6-ac0f-409b-a113-b4293ace60e1-config-volume\") pod \"ab5faba6-ac0f-409b-a113-b4293ace60e1\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.173254 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab5faba6-ac0f-409b-a113-b4293ace60e1-secret-volume\") pod \"ab5faba6-ac0f-409b-a113-b4293ace60e1\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.173409 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxrv9\" (UniqueName: \"kubernetes.io/projected/ab5faba6-ac0f-409b-a113-b4293ace60e1-kube-api-access-mxrv9\") pod \"ab5faba6-ac0f-409b-a113-b4293ace60e1\" (UID: \"ab5faba6-ac0f-409b-a113-b4293ace60e1\") " Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.173920 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab5faba6-ac0f-409b-a113-b4293ace60e1-config-volume" (OuterVolumeSpecName: "config-volume") pod "ab5faba6-ac0f-409b-a113-b4293ace60e1" (UID: "ab5faba6-ac0f-409b-a113-b4293ace60e1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.174558 4546 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab5faba6-ac0f-409b-a113-b4293ace60e1-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.196087 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab5faba6-ac0f-409b-a113-b4293ace60e1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ab5faba6-ac0f-409b-a113-b4293ace60e1" (UID: "ab5faba6-ac0f-409b-a113-b4293ace60e1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.196155 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5faba6-ac0f-409b-a113-b4293ace60e1-kube-api-access-mxrv9" (OuterVolumeSpecName: "kube-api-access-mxrv9") pod "ab5faba6-ac0f-409b-a113-b4293ace60e1" (UID: "ab5faba6-ac0f-409b-a113-b4293ace60e1"). InnerVolumeSpecName "kube-api-access-mxrv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.278018 4546 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab5faba6-ac0f-409b-a113-b4293ace60e1-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.278060 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxrv9\" (UniqueName: \"kubernetes.io/projected/ab5faba6-ac0f-409b-a113-b4293ace60e1-kube-api-access-mxrv9\") on node \"crc\" DevicePath \"\"" Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.675466 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" event={"ID":"ab5faba6-ac0f-409b-a113-b4293ace60e1","Type":"ContainerDied","Data":"351e42fa4a9560efdbcae630ea6b13cbb6e1c88a9d6ca71acb6c0fd7b54fb275"} Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.675507 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="351e42fa4a9560efdbcae630ea6b13cbb6e1c88a9d6ca71acb6c0fd7b54fb275" Feb 01 09:00:03 crc kubenswrapper[4546]: I0201 09:00:03.675569 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-t8jt4" Feb 01 09:00:04 crc kubenswrapper[4546]: I0201 09:00:04.179074 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"] Feb 01 09:00:04 crc kubenswrapper[4546]: I0201 09:00:04.186236 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-6sz5j"] Feb 01 09:00:05 crc kubenswrapper[4546]: I0201 09:00:05.667644 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e154031-2652-43f7-8970-38bd3e61f165" path="/var/lib/kubelet/pods/5e154031-2652-43f7-8970-38bd3e61f165/volumes" Feb 01 09:00:25 crc kubenswrapper[4546]: I0201 09:00:25.421544 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:00:25 crc kubenswrapper[4546]: I0201 09:00:25.422360 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:00:25 crc kubenswrapper[4546]: I0201 09:00:25.422436 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 09:00:25 crc kubenswrapper[4546]: I0201 09:00:25.424164 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b5a496740023438907fdebf353d928489e597bb30c2005e4d201ba326abedf7"} 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:00:25 crc kubenswrapper[4546]: I0201 09:00:25.424268 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://5b5a496740023438907fdebf353d928489e597bb30c2005e4d201ba326abedf7" gracePeriod=600 Feb 01 09:00:25 crc kubenswrapper[4546]: I0201 09:00:25.912878 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="5b5a496740023438907fdebf353d928489e597bb30c2005e4d201ba326abedf7" exitCode=0 Feb 01 09:00:25 crc kubenswrapper[4546]: I0201 09:00:25.912893 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"5b5a496740023438907fdebf353d928489e597bb30c2005e4d201ba326abedf7"} Feb 01 09:00:25 crc kubenswrapper[4546]: I0201 09:00:25.913420 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"} Feb 01 09:00:25 crc kubenswrapper[4546]: I0201 09:00:25.913446 4546 scope.go:117] "RemoveContainer" containerID="683b0a09033d035a330ba5fef6b6a062d2c084970d6d7fa1d24bb16260c18b75" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.189107 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29498941-58x9q"] Feb 01 09:01:00 crc kubenswrapper[4546]: E0201 09:01:00.191914 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5faba6-ac0f-409b-a113-b4293ace60e1" 
containerName="collect-profiles" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.192002 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5faba6-ac0f-409b-a113-b4293ace60e1" containerName="collect-profiles" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.192265 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5faba6-ac0f-409b-a113-b4293ace60e1" containerName="collect-profiles" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.193201 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.210333 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29498941-58x9q"] Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.350893 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-config-data\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.351225 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-fernet-keys\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.351457 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-combined-ca-bundle\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 
09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.351561 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brxjk\" (UniqueName: \"kubernetes.io/projected/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-kube-api-access-brxjk\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.454192 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brxjk\" (UniqueName: \"kubernetes.io/projected/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-kube-api-access-brxjk\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.454389 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-config-data\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.454422 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-fernet-keys\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.454620 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-combined-ca-bundle\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc 
kubenswrapper[4546]: I0201 09:01:00.462718 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-fernet-keys\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.462813 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-config-data\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.467940 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-combined-ca-bundle\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.471761 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brxjk\" (UniqueName: \"kubernetes.io/projected/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-kube-api-access-brxjk\") pod \"keystone-cron-29498941-58x9q\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.519598 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:00 crc kubenswrapper[4546]: I0201 09:01:00.559322 4546 scope.go:117] "RemoveContainer" containerID="721227fa12378e3bb777b268c4708b2140feba5007d4f41f47ca8dda57fe0a16" Feb 01 09:01:01 crc kubenswrapper[4546]: I0201 09:01:01.010497 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29498941-58x9q"] Feb 01 09:01:01 crc kubenswrapper[4546]: I0201 09:01:01.244927 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498941-58x9q" event={"ID":"96c771b6-03d4-45e4-85ff-cfdc7ff5235e","Type":"ContainerStarted","Data":"a7cb66aae51c6150a0315416f50c9c42c9ec68182bbff1931a68741f400ba708"} Feb 01 09:01:01 crc kubenswrapper[4546]: I0201 09:01:01.245400 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498941-58x9q" event={"ID":"96c771b6-03d4-45e4-85ff-cfdc7ff5235e","Type":"ContainerStarted","Data":"f57e5489a1a279bca5caeb043b8ae0ee8bc6b404247a673b776e32ac1b6748fa"} Feb 01 09:01:01 crc kubenswrapper[4546]: I0201 09:01:01.268490 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29498941-58x9q" podStartSLOduration=1.268461201 podStartE2EDuration="1.268461201s" podCreationTimestamp="2026-02-01 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 09:01:01.260489241 +0000 UTC m=+8291.911425257" watchObservedRunningTime="2026-02-01 09:01:01.268461201 +0000 UTC m=+8291.919397237" Feb 01 09:01:04 crc kubenswrapper[4546]: I0201 09:01:04.273676 4546 generic.go:334] "Generic (PLEG): container finished" podID="96c771b6-03d4-45e4-85ff-cfdc7ff5235e" containerID="a7cb66aae51c6150a0315416f50c9c42c9ec68182bbff1931a68741f400ba708" exitCode=0 Feb 01 09:01:04 crc kubenswrapper[4546]: I0201 09:01:04.273775 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29498941-58x9q" event={"ID":"96c771b6-03d4-45e4-85ff-cfdc7ff5235e","Type":"ContainerDied","Data":"a7cb66aae51c6150a0315416f50c9c42c9ec68182bbff1931a68741f400ba708"} Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.639730 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.698294 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-combined-ca-bundle\") pod \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.698507 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brxjk\" (UniqueName: \"kubernetes.io/projected/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-kube-api-access-brxjk\") pod \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.698560 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-fernet-keys\") pod \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.698879 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-config-data\") pod \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\" (UID: \"96c771b6-03d4-45e4-85ff-cfdc7ff5235e\") " Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.708074 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-kube-api-access-brxjk" (OuterVolumeSpecName: "kube-api-access-brxjk") pod "96c771b6-03d4-45e4-85ff-cfdc7ff5235e" (UID: "96c771b6-03d4-45e4-85ff-cfdc7ff5235e"). InnerVolumeSpecName "kube-api-access-brxjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.713157 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "96c771b6-03d4-45e4-85ff-cfdc7ff5235e" (UID: "96c771b6-03d4-45e4-85ff-cfdc7ff5235e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.743109 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96c771b6-03d4-45e4-85ff-cfdc7ff5235e" (UID: "96c771b6-03d4-45e4-85ff-cfdc7ff5235e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.756943 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-config-data" (OuterVolumeSpecName: "config-data") pod "96c771b6-03d4-45e4-85ff-cfdc7ff5235e" (UID: "96c771b6-03d4-45e4-85ff-cfdc7ff5235e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.803058 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brxjk\" (UniqueName: \"kubernetes.io/projected/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-kube-api-access-brxjk\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.803171 4546 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.803231 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:05 crc kubenswrapper[4546]: I0201 09:01:05.803282 4546 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96c771b6-03d4-45e4-85ff-cfdc7ff5235e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:06 crc kubenswrapper[4546]: I0201 09:01:06.295371 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498941-58x9q" event={"ID":"96c771b6-03d4-45e4-85ff-cfdc7ff5235e","Type":"ContainerDied","Data":"f57e5489a1a279bca5caeb043b8ae0ee8bc6b404247a673b776e32ac1b6748fa"} Feb 01 09:01:06 crc kubenswrapper[4546]: I0201 09:01:06.295425 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f57e5489a1a279bca5caeb043b8ae0ee8bc6b404247a673b776e32ac1b6748fa" Feb 01 09:01:06 crc kubenswrapper[4546]: I0201 09:01:06.295449 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29498941-58x9q" Feb 01 09:02:25 crc kubenswrapper[4546]: I0201 09:02:25.420579 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:02:25 crc kubenswrapper[4546]: I0201 09:02:25.421240 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:02:55 crc kubenswrapper[4546]: I0201 09:02:55.420572 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:02:55 crc kubenswrapper[4546]: I0201 09:02:55.422011 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:03:08 crc kubenswrapper[4546]: I0201 09:03:08.867158 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wzl5f"] Feb 01 09:03:08 crc kubenswrapper[4546]: E0201 09:03:08.868068 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c771b6-03d4-45e4-85ff-cfdc7ff5235e" containerName="keystone-cron" Feb 01 09:03:08 crc kubenswrapper[4546]: I0201 09:03:08.868087 
4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c771b6-03d4-45e4-85ff-cfdc7ff5235e" containerName="keystone-cron" Feb 01 09:03:08 crc kubenswrapper[4546]: I0201 09:03:08.868367 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c771b6-03d4-45e4-85ff-cfdc7ff5235e" containerName="keystone-cron" Feb 01 09:03:08 crc kubenswrapper[4546]: I0201 09:03:08.877167 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:08 crc kubenswrapper[4546]: I0201 09:03:08.889600 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzl5f"] Feb 01 09:03:08 crc kubenswrapper[4546]: I0201 09:03:08.900705 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-catalog-content\") pod \"community-operators-wzl5f\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:08 crc kubenswrapper[4546]: I0201 09:03:08.900867 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-utilities\") pod \"community-operators-wzl5f\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:08 crc kubenswrapper[4546]: I0201 09:03:08.901154 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb6l4\" (UniqueName: \"kubernetes.io/projected/a36c644b-9ae1-431a-8e4b-d4527509372d-kube-api-access-mb6l4\") pod \"community-operators-wzl5f\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:09 crc kubenswrapper[4546]: I0201 
09:03:09.002890 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-catalog-content\") pod \"community-operators-wzl5f\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:09 crc kubenswrapper[4546]: I0201 09:03:09.003216 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-utilities\") pod \"community-operators-wzl5f\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:09 crc kubenswrapper[4546]: I0201 09:03:09.003409 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-catalog-content\") pod \"community-operators-wzl5f\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:09 crc kubenswrapper[4546]: I0201 09:03:09.003613 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb6l4\" (UniqueName: \"kubernetes.io/projected/a36c644b-9ae1-431a-8e4b-d4527509372d-kube-api-access-mb6l4\") pod \"community-operators-wzl5f\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:09 crc kubenswrapper[4546]: I0201 09:03:09.003622 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-utilities\") pod \"community-operators-wzl5f\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:09 crc kubenswrapper[4546]: I0201 09:03:09.021601 4546 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb6l4\" (UniqueName: \"kubernetes.io/projected/a36c644b-9ae1-431a-8e4b-d4527509372d-kube-api-access-mb6l4\") pod \"community-operators-wzl5f\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:09 crc kubenswrapper[4546]: I0201 09:03:09.195876 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:09 crc kubenswrapper[4546]: I0201 09:03:09.969618 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzl5f"] Feb 01 09:03:10 crc kubenswrapper[4546]: I0201 09:03:10.494832 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzl5f" event={"ID":"a36c644b-9ae1-431a-8e4b-d4527509372d","Type":"ContainerDied","Data":"3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b"} Feb 01 09:03:10 crc kubenswrapper[4546]: I0201 09:03:10.496415 4546 generic.go:334] "Generic (PLEG): container finished" podID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerID="3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b" exitCode=0 Feb 01 09:03:10 crc kubenswrapper[4546]: I0201 09:03:10.496477 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzl5f" event={"ID":"a36c644b-9ae1-431a-8e4b-d4527509372d","Type":"ContainerStarted","Data":"57b63a2068ce3848bb7fc394e430ea4bb3844ad8781df86b1f3527af05135c8f"} Feb 01 09:03:10 crc kubenswrapper[4546]: I0201 09:03:10.500572 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 09:03:11 crc kubenswrapper[4546]: I0201 09:03:11.508848 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzl5f" 
event={"ID":"a36c644b-9ae1-431a-8e4b-d4527509372d","Type":"ContainerStarted","Data":"ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60"} Feb 01 09:03:12 crc kubenswrapper[4546]: I0201 09:03:12.522961 4546 generic.go:334] "Generic (PLEG): container finished" podID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerID="ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60" exitCode=0 Feb 01 09:03:12 crc kubenswrapper[4546]: I0201 09:03:12.523017 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzl5f" event={"ID":"a36c644b-9ae1-431a-8e4b-d4527509372d","Type":"ContainerDied","Data":"ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60"} Feb 01 09:03:13 crc kubenswrapper[4546]: I0201 09:03:13.534735 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzl5f" event={"ID":"a36c644b-9ae1-431a-8e4b-d4527509372d","Type":"ContainerStarted","Data":"038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7"} Feb 01 09:03:13 crc kubenswrapper[4546]: I0201 09:03:13.560649 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wzl5f" podStartSLOduration=2.988454655 podStartE2EDuration="5.559579827s" podCreationTimestamp="2026-02-01 09:03:08 +0000 UTC" firstStartedPulling="2026-02-01 09:03:10.497002794 +0000 UTC m=+8421.147938809" lastFinishedPulling="2026-02-01 09:03:13.068127964 +0000 UTC m=+8423.719063981" observedRunningTime="2026-02-01 09:03:13.551373146 +0000 UTC m=+8424.202309162" watchObservedRunningTime="2026-02-01 09:03:13.559579827 +0000 UTC m=+8424.210515844" Feb 01 09:03:19 crc kubenswrapper[4546]: I0201 09:03:19.196939 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:19 crc kubenswrapper[4546]: I0201 09:03:19.197623 4546 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:19 crc kubenswrapper[4546]: I0201 09:03:19.241645 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:19 crc kubenswrapper[4546]: I0201 09:03:19.631900 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:19 crc kubenswrapper[4546]: I0201 09:03:19.681897 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzl5f"] Feb 01 09:03:21 crc kubenswrapper[4546]: I0201 09:03:21.623758 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wzl5f" podUID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerName="registry-server" containerID="cri-o://038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7" gracePeriod=2 Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.124276 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.222519 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb6l4\" (UniqueName: \"kubernetes.io/projected/a36c644b-9ae1-431a-8e4b-d4527509372d-kube-api-access-mb6l4\") pod \"a36c644b-9ae1-431a-8e4b-d4527509372d\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.223102 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-catalog-content\") pod \"a36c644b-9ae1-431a-8e4b-d4527509372d\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.223298 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-utilities\") pod \"a36c644b-9ae1-431a-8e4b-d4527509372d\" (UID: \"a36c644b-9ae1-431a-8e4b-d4527509372d\") " Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.224502 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-utilities" (OuterVolumeSpecName: "utilities") pod "a36c644b-9ae1-431a-8e4b-d4527509372d" (UID: "a36c644b-9ae1-431a-8e4b-d4527509372d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.236983 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36c644b-9ae1-431a-8e4b-d4527509372d-kube-api-access-mb6l4" (OuterVolumeSpecName: "kube-api-access-mb6l4") pod "a36c644b-9ae1-431a-8e4b-d4527509372d" (UID: "a36c644b-9ae1-431a-8e4b-d4527509372d"). InnerVolumeSpecName "kube-api-access-mb6l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.271722 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a36c644b-9ae1-431a-8e4b-d4527509372d" (UID: "a36c644b-9ae1-431a-8e4b-d4527509372d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.326477 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.326520 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb6l4\" (UniqueName: \"kubernetes.io/projected/a36c644b-9ae1-431a-8e4b-d4527509372d-kube-api-access-mb6l4\") on node \"crc\" DevicePath \"\"" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.326536 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36c644b-9ae1-431a-8e4b-d4527509372d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.636364 4546 generic.go:334] "Generic (PLEG): container finished" podID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerID="038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7" exitCode=0 Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.636418 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzl5f" event={"ID":"a36c644b-9ae1-431a-8e4b-d4527509372d","Type":"ContainerDied","Data":"038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7"} Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.636764 4546 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-wzl5f" event={"ID":"a36c644b-9ae1-431a-8e4b-d4527509372d","Type":"ContainerDied","Data":"57b63a2068ce3848bb7fc394e430ea4bb3844ad8781df86b1f3527af05135c8f"} Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.636459 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzl5f" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.637226 4546 scope.go:117] "RemoveContainer" containerID="038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.675829 4546 scope.go:117] "RemoveContainer" containerID="ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.683806 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzl5f"] Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.697475 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wzl5f"] Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.707139 4546 scope.go:117] "RemoveContainer" containerID="3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.743046 4546 scope.go:117] "RemoveContainer" containerID="038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7" Feb 01 09:03:22 crc kubenswrapper[4546]: E0201 09:03:22.745699 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7\": container with ID starting with 038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7 not found: ID does not exist" containerID="038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 
09:03:22.746345 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7"} err="failed to get container status \"038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7\": rpc error: code = NotFound desc = could not find container \"038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7\": container with ID starting with 038c9c8d4f90fefb19e9a88ec65f3a940d0f84274d4f41f44e707e9b882ef4c7 not found: ID does not exist" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.746382 4546 scope.go:117] "RemoveContainer" containerID="ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60" Feb 01 09:03:22 crc kubenswrapper[4546]: E0201 09:03:22.746881 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60\": container with ID starting with ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60 not found: ID does not exist" containerID="ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.746903 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60"} err="failed to get container status \"ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60\": rpc error: code = NotFound desc = could not find container \"ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60\": container with ID starting with ce98f3f1ecfebada9be37ec2f5e267833aee8db49943138cbcda04ae87cfee60 not found: ID does not exist" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.746915 4546 scope.go:117] "RemoveContainer" containerID="3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b" Feb 01 09:03:22 crc 
kubenswrapper[4546]: E0201 09:03:22.749941 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b\": container with ID starting with 3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b not found: ID does not exist" containerID="3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b" Feb 01 09:03:22 crc kubenswrapper[4546]: I0201 09:03:22.750054 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b"} err="failed to get container status \"3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b\": rpc error: code = NotFound desc = could not find container \"3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b\": container with ID starting with 3da0dc2ddb023a9eb5fa03044f080fae434f6157c8eded31a539d51bc203155b not found: ID does not exist" Feb 01 09:03:23 crc kubenswrapper[4546]: I0201 09:03:23.665389 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36c644b-9ae1-431a-8e4b-d4527509372d" path="/var/lib/kubelet/pods/a36c644b-9ae1-431a-8e4b-d4527509372d/volumes" Feb 01 09:03:25 crc kubenswrapper[4546]: I0201 09:03:25.421463 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:03:25 crc kubenswrapper[4546]: I0201 09:03:25.421946 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 01 09:03:25 crc kubenswrapper[4546]: I0201 09:03:25.422022 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 09:03:25 crc kubenswrapper[4546]: I0201 09:03:25.423287 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:03:25 crc kubenswrapper[4546]: I0201 09:03:25.423364 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" gracePeriod=600 Feb 01 09:03:25 crc kubenswrapper[4546]: E0201 09:03:25.549228 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:03:25 crc kubenswrapper[4546]: I0201 09:03:25.675266 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" exitCode=0 Feb 01 09:03:25 crc kubenswrapper[4546]: I0201 09:03:25.675323 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"} Feb 01 09:03:25 crc kubenswrapper[4546]: I0201 09:03:25.675366 4546 scope.go:117] "RemoveContainer" containerID="5b5a496740023438907fdebf353d928489e597bb30c2005e4d201ba326abedf7" Feb 01 09:03:25 crc kubenswrapper[4546]: I0201 09:03:25.675870 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:03:25 crc kubenswrapper[4546]: E0201 09:03:25.676158 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:03:37 crc kubenswrapper[4546]: I0201 09:03:37.657481 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:03:37 crc kubenswrapper[4546]: E0201 09:03:37.658992 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:03:52 crc kubenswrapper[4546]: I0201 09:03:52.656267 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:03:52 crc kubenswrapper[4546]: E0201 09:03:52.657378 4546 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.159091 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6qflc"] Feb 01 09:04:02 crc kubenswrapper[4546]: E0201 09:04:02.160801 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerName="registry-server" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.160824 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerName="registry-server" Feb 01 09:04:02 crc kubenswrapper[4546]: E0201 09:04:02.160915 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerName="extract-content" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.160924 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerName="extract-content" Feb 01 09:04:02 crc kubenswrapper[4546]: E0201 09:04:02.160957 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerName="extract-utilities" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.160965 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerName="extract-utilities" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.161331 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36c644b-9ae1-431a-8e4b-d4527509372d" containerName="registry-server" Feb 01 
09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.164547 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.192731 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qflc"] Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.351003 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8fqd\" (UniqueName: \"kubernetes.io/projected/e75c434f-60d7-4936-b336-1ec6f084889e-kube-api-access-n8fqd\") pod \"certified-operators-6qflc\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.351116 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-utilities\") pod \"certified-operators-6qflc\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.351495 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-catalog-content\") pod \"certified-operators-6qflc\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.453764 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8fqd\" (UniqueName: \"kubernetes.io/projected/e75c434f-60d7-4936-b336-1ec6f084889e-kube-api-access-n8fqd\") pod \"certified-operators-6qflc\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " 
pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.453838 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-utilities\") pod \"certified-operators-6qflc\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.454075 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-catalog-content\") pod \"certified-operators-6qflc\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.454569 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-catalog-content\") pod \"certified-operators-6qflc\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.454602 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-utilities\") pod \"certified-operators-6qflc\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.471508 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8fqd\" (UniqueName: \"kubernetes.io/projected/e75c434f-60d7-4936-b336-1ec6f084889e-kube-api-access-n8fqd\") pod \"certified-operators-6qflc\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " 
pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.490678 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:02 crc kubenswrapper[4546]: I0201 09:04:02.985910 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qflc"] Feb 01 09:04:03 crc kubenswrapper[4546]: I0201 09:04:03.067438 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qflc" event={"ID":"e75c434f-60d7-4936-b336-1ec6f084889e","Type":"ContainerStarted","Data":"1b656c26e0df67401fa8388b237c8387231af6e8dbf44d0d78056ca0f45aa4da"} Feb 01 09:04:04 crc kubenswrapper[4546]: I0201 09:04:04.076765 4546 generic.go:334] "Generic (PLEG): container finished" podID="e75c434f-60d7-4936-b336-1ec6f084889e" containerID="9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224" exitCode=0 Feb 01 09:04:04 crc kubenswrapper[4546]: I0201 09:04:04.076896 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qflc" event={"ID":"e75c434f-60d7-4936-b336-1ec6f084889e","Type":"ContainerDied","Data":"9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224"} Feb 01 09:04:05 crc kubenswrapper[4546]: I0201 09:04:05.086983 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qflc" event={"ID":"e75c434f-60d7-4936-b336-1ec6f084889e","Type":"ContainerStarted","Data":"68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a"} Feb 01 09:04:05 crc kubenswrapper[4546]: I0201 09:04:05.655136 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:04:05 crc kubenswrapper[4546]: E0201 09:04:05.655511 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:04:07 crc kubenswrapper[4546]: I0201 09:04:07.111506 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qflc" event={"ID":"e75c434f-60d7-4936-b336-1ec6f084889e","Type":"ContainerDied","Data":"68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a"} Feb 01 09:04:07 crc kubenswrapper[4546]: I0201 09:04:07.111460 4546 generic.go:334] "Generic (PLEG): container finished" podID="e75c434f-60d7-4936-b336-1ec6f084889e" containerID="68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a" exitCode=0 Feb 01 09:04:08 crc kubenswrapper[4546]: I0201 09:04:08.127350 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qflc" event={"ID":"e75c434f-60d7-4936-b336-1ec6f084889e","Type":"ContainerStarted","Data":"56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4"} Feb 01 09:04:12 crc kubenswrapper[4546]: I0201 09:04:12.491947 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:12 crc kubenswrapper[4546]: I0201 09:04:12.493776 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:12 crc kubenswrapper[4546]: I0201 09:04:12.534441 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:12 crc kubenswrapper[4546]: I0201 09:04:12.558217 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6qflc" 
podStartSLOduration=7.057560057 podStartE2EDuration="10.5582s" podCreationTimestamp="2026-02-01 09:04:02 +0000 UTC" firstStartedPulling="2026-02-01 09:04:04.079577533 +0000 UTC m=+8474.730513549" lastFinishedPulling="2026-02-01 09:04:07.580217476 +0000 UTC m=+8478.231153492" observedRunningTime="2026-02-01 09:04:08.157904701 +0000 UTC m=+8478.808840716" watchObservedRunningTime="2026-02-01 09:04:12.5582 +0000 UTC m=+8483.209136016" Feb 01 09:04:13 crc kubenswrapper[4546]: I0201 09:04:13.220957 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:13 crc kubenswrapper[4546]: I0201 09:04:13.321247 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qflc"] Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.186789 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6qflc" podUID="e75c434f-60d7-4936-b336-1ec6f084889e" containerName="registry-server" containerID="cri-o://56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4" gracePeriod=2 Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.692065 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.816597 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-utilities\") pod \"e75c434f-60d7-4936-b336-1ec6f084889e\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.816811 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-catalog-content\") pod \"e75c434f-60d7-4936-b336-1ec6f084889e\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.816841 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8fqd\" (UniqueName: \"kubernetes.io/projected/e75c434f-60d7-4936-b336-1ec6f084889e-kube-api-access-n8fqd\") pod \"e75c434f-60d7-4936-b336-1ec6f084889e\" (UID: \"e75c434f-60d7-4936-b336-1ec6f084889e\") " Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.817977 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-utilities" (OuterVolumeSpecName: "utilities") pod "e75c434f-60d7-4936-b336-1ec6f084889e" (UID: "e75c434f-60d7-4936-b336-1ec6f084889e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.818821 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.826115 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75c434f-60d7-4936-b336-1ec6f084889e-kube-api-access-n8fqd" (OuterVolumeSpecName: "kube-api-access-n8fqd") pod "e75c434f-60d7-4936-b336-1ec6f084889e" (UID: "e75c434f-60d7-4936-b336-1ec6f084889e"). InnerVolumeSpecName "kube-api-access-n8fqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.875199 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e75c434f-60d7-4936-b336-1ec6f084889e" (UID: "e75c434f-60d7-4936-b336-1ec6f084889e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.922636 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75c434f-60d7-4936-b336-1ec6f084889e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:04:15 crc kubenswrapper[4546]: I0201 09:04:15.922671 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8fqd\" (UniqueName: \"kubernetes.io/projected/e75c434f-60d7-4936-b336-1ec6f084889e-kube-api-access-n8fqd\") on node \"crc\" DevicePath \"\"" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.200716 4546 generic.go:334] "Generic (PLEG): container finished" podID="e75c434f-60d7-4936-b336-1ec6f084889e" containerID="56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4" exitCode=0 Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.200782 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qflc" event={"ID":"e75c434f-60d7-4936-b336-1ec6f084889e","Type":"ContainerDied","Data":"56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4"} Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.200821 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qflc" event={"ID":"e75c434f-60d7-4936-b336-1ec6f084889e","Type":"ContainerDied","Data":"1b656c26e0df67401fa8388b237c8387231af6e8dbf44d0d78056ca0f45aa4da"} Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.200845 4546 scope.go:117] "RemoveContainer" containerID="56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.201119 4546 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qflc" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.241916 4546 scope.go:117] "RemoveContainer" containerID="68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.248717 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qflc"] Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.260146 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6qflc"] Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.295585 4546 scope.go:117] "RemoveContainer" containerID="9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.324402 4546 scope.go:117] "RemoveContainer" containerID="56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4" Feb 01 09:04:16 crc kubenswrapper[4546]: E0201 09:04:16.324928 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4\": container with ID starting with 56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4 not found: ID does not exist" containerID="56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.324968 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4"} err="failed to get container status \"56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4\": rpc error: code = NotFound desc = could not find container \"56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4\": container with ID starting with 56df71607c5dad223806a8560e97c165b30253c6c8e4cec0427c303badbe5db4 not 
found: ID does not exist" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.324992 4546 scope.go:117] "RemoveContainer" containerID="68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a" Feb 01 09:04:16 crc kubenswrapper[4546]: E0201 09:04:16.325307 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a\": container with ID starting with 68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a not found: ID does not exist" containerID="68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.325334 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a"} err="failed to get container status \"68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a\": rpc error: code = NotFound desc = could not find container \"68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a\": container with ID starting with 68ca308c9ff4bc9783c98b588690b8e9bbd1497e1653e145a38afbb288f00a9a not found: ID does not exist" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.325351 4546 scope.go:117] "RemoveContainer" containerID="9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224" Feb 01 09:04:16 crc kubenswrapper[4546]: E0201 09:04:16.325672 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224\": container with ID starting with 9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224 not found: ID does not exist" containerID="9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224" Feb 01 09:04:16 crc kubenswrapper[4546]: I0201 09:04:16.325695 4546 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224"} err="failed to get container status \"9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224\": rpc error: code = NotFound desc = could not find container \"9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224\": container with ID starting with 9dd57d7575377832cc3864b06b5adfc4587f9f4954a7c74475c771d7d2734224 not found: ID does not exist" Feb 01 09:04:17 crc kubenswrapper[4546]: I0201 09:04:17.667615 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75c434f-60d7-4936-b336-1ec6f084889e" path="/var/lib/kubelet/pods/e75c434f-60d7-4936-b336-1ec6f084889e/volumes" Feb 01 09:04:20 crc kubenswrapper[4546]: I0201 09:04:20.654738 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:04:20 crc kubenswrapper[4546]: E0201 09:04:20.655620 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:04:32 crc kubenswrapper[4546]: I0201 09:04:32.655301 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:04:32 crc kubenswrapper[4546]: E0201 09:04:32.656248 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:04:43 crc kubenswrapper[4546]: I0201 09:04:43.655045 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:04:43 crc kubenswrapper[4546]: E0201 09:04:43.656168 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:04:55 crc kubenswrapper[4546]: I0201 09:04:55.655791 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:04:55 crc kubenswrapper[4546]: E0201 09:04:55.657336 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:05:08 crc kubenswrapper[4546]: I0201 09:05:08.655185 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:05:08 crc kubenswrapper[4546]: E0201 09:05:08.657056 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:05:20 crc kubenswrapper[4546]: I0201 09:05:20.654940 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:05:20 crc kubenswrapper[4546]: E0201 09:05:20.655667 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:05:31 crc kubenswrapper[4546]: I0201 09:05:31.655115 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:05:31 crc kubenswrapper[4546]: E0201 09:05:31.656179 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:05:44 crc kubenswrapper[4546]: I0201 09:05:44.655019 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:05:44 crc kubenswrapper[4546]: E0201 09:05:44.657074 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:05:57 crc kubenswrapper[4546]: I0201 09:05:57.655004 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:05:57 crc kubenswrapper[4546]: E0201 09:05:57.656170 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.451773 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zlqrm"] Feb 01 09:06:03 crc kubenswrapper[4546]: E0201 09:06:03.453071 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75c434f-60d7-4936-b336-1ec6f084889e" containerName="registry-server" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.453087 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75c434f-60d7-4936-b336-1ec6f084889e" containerName="registry-server" Feb 01 09:06:03 crc kubenswrapper[4546]: E0201 09:06:03.453105 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75c434f-60d7-4936-b336-1ec6f084889e" containerName="extract-content" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.453111 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75c434f-60d7-4936-b336-1ec6f084889e" containerName="extract-content" Feb 01 09:06:03 crc kubenswrapper[4546]: E0201 09:06:03.453120 4546 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e75c434f-60d7-4936-b336-1ec6f084889e" containerName="extract-utilities" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.453125 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75c434f-60d7-4936-b336-1ec6f084889e" containerName="extract-utilities" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.453541 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75c434f-60d7-4936-b336-1ec6f084889e" containerName="registry-server" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.454988 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.466281 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zlqrm"] Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.518293 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-catalog-content\") pod \"redhat-operators-zlqrm\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.518432 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzlmp\" (UniqueName: \"kubernetes.io/projected/889475ab-a942-4090-8f52-5e46ae933dad-kube-api-access-gzlmp\") pod \"redhat-operators-zlqrm\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.518717 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-utilities\") pod 
\"redhat-operators-zlqrm\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.621448 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-utilities\") pod \"redhat-operators-zlqrm\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.621647 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-catalog-content\") pod \"redhat-operators-zlqrm\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.621730 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzlmp\" (UniqueName: \"kubernetes.io/projected/889475ab-a942-4090-8f52-5e46ae933dad-kube-api-access-gzlmp\") pod \"redhat-operators-zlqrm\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.622696 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-catalog-content\") pod \"redhat-operators-zlqrm\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.622817 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-utilities\") pod \"redhat-operators-zlqrm\" (UID: 
\"889475ab-a942-4090-8f52-5e46ae933dad\") " pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.640172 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzlmp\" (UniqueName: \"kubernetes.io/projected/889475ab-a942-4090-8f52-5e46ae933dad-kube-api-access-gzlmp\") pod \"redhat-operators-zlqrm\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:03 crc kubenswrapper[4546]: I0201 09:06:03.775309 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:04 crc kubenswrapper[4546]: I0201 09:06:04.349284 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zlqrm"] Feb 01 09:06:05 crc kubenswrapper[4546]: I0201 09:06:05.221140 4546 generic.go:334] "Generic (PLEG): container finished" podID="889475ab-a942-4090-8f52-5e46ae933dad" containerID="e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307" exitCode=0 Feb 01 09:06:05 crc kubenswrapper[4546]: I0201 09:06:05.221361 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlqrm" event={"ID":"889475ab-a942-4090-8f52-5e46ae933dad","Type":"ContainerDied","Data":"e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307"} Feb 01 09:06:05 crc kubenswrapper[4546]: I0201 09:06:05.221448 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlqrm" event={"ID":"889475ab-a942-4090-8f52-5e46ae933dad","Type":"ContainerStarted","Data":"3715862471cc1e4287f78ebb81dbb25a96f3609d74f77ef809f26d503333f02f"} Feb 01 09:06:06 crc kubenswrapper[4546]: I0201 09:06:06.230197 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlqrm" 
event={"ID":"889475ab-a942-4090-8f52-5e46ae933dad","Type":"ContainerStarted","Data":"d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840"} Feb 01 09:06:08 crc kubenswrapper[4546]: I0201 09:06:08.656446 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:06:08 crc kubenswrapper[4546]: E0201 09:06:08.657128 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:06:09 crc kubenswrapper[4546]: I0201 09:06:09.262666 4546 generic.go:334] "Generic (PLEG): container finished" podID="889475ab-a942-4090-8f52-5e46ae933dad" containerID="d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840" exitCode=0 Feb 01 09:06:09 crc kubenswrapper[4546]: I0201 09:06:09.262904 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlqrm" event={"ID":"889475ab-a942-4090-8f52-5e46ae933dad","Type":"ContainerDied","Data":"d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840"} Feb 01 09:06:10 crc kubenswrapper[4546]: I0201 09:06:10.287400 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlqrm" event={"ID":"889475ab-a942-4090-8f52-5e46ae933dad","Type":"ContainerStarted","Data":"1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5"} Feb 01 09:06:10 crc kubenswrapper[4546]: I0201 09:06:10.316484 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zlqrm" podStartSLOduration=2.749572464 podStartE2EDuration="7.315609029s" 
podCreationTimestamp="2026-02-01 09:06:03 +0000 UTC" firstStartedPulling="2026-02-01 09:06:05.224500466 +0000 UTC m=+8595.875436482" lastFinishedPulling="2026-02-01 09:06:09.790537031 +0000 UTC m=+8600.441473047" observedRunningTime="2026-02-01 09:06:10.309052958 +0000 UTC m=+8600.959988965" watchObservedRunningTime="2026-02-01 09:06:10.315609029 +0000 UTC m=+8600.966545045" Feb 01 09:06:13 crc kubenswrapper[4546]: I0201 09:06:13.775667 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:13 crc kubenswrapper[4546]: I0201 09:06:13.777467 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:14 crc kubenswrapper[4546]: I0201 09:06:14.821713 4546 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zlqrm" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="registry-server" probeResult="failure" output=< Feb 01 09:06:14 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 09:06:14 crc kubenswrapper[4546]: > Feb 01 09:06:21 crc kubenswrapper[4546]: I0201 09:06:21.655539 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:06:21 crc kubenswrapper[4546]: E0201 09:06:21.656589 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:06:24 crc kubenswrapper[4546]: I0201 09:06:24.816508 4546 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-zlqrm" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="registry-server" probeResult="failure" output=< Feb 01 09:06:24 crc kubenswrapper[4546]: timeout: failed to connect service ":50051" within 1s Feb 01 09:06:24 crc kubenswrapper[4546]: > Feb 01 09:06:33 crc kubenswrapper[4546]: I0201 09:06:33.817940 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:33 crc kubenswrapper[4546]: I0201 09:06:33.861029 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:34 crc kubenswrapper[4546]: I0201 09:06:34.638568 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zlqrm"] Feb 01 09:06:34 crc kubenswrapper[4546]: I0201 09:06:34.654582 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:06:34 crc kubenswrapper[4546]: E0201 09:06:34.654828 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" Feb 01 09:06:35 crc kubenswrapper[4546]: I0201 09:06:35.543881 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zlqrm" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="registry-server" containerID="cri-o://1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5" gracePeriod=2 Feb 01 09:06:35 crc kubenswrapper[4546]: E0201 09:06:35.735077 4546 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod889475ab_a942_4090_8f52_5e46ae933dad.slice/crio-1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5.scope\": RecentStats: unable to find data in memory cache]" Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.191778 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zlqrm" Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.279918 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzlmp\" (UniqueName: \"kubernetes.io/projected/889475ab-a942-4090-8f52-5e46ae933dad-kube-api-access-gzlmp\") pod \"889475ab-a942-4090-8f52-5e46ae933dad\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.279982 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-catalog-content\") pod \"889475ab-a942-4090-8f52-5e46ae933dad\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.280048 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-utilities\") pod \"889475ab-a942-4090-8f52-5e46ae933dad\" (UID: \"889475ab-a942-4090-8f52-5e46ae933dad\") " Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.287774 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-utilities" (OuterVolumeSpecName: "utilities") pod "889475ab-a942-4090-8f52-5e46ae933dad" (UID: "889475ab-a942-4090-8f52-5e46ae933dad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.301894 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/889475ab-a942-4090-8f52-5e46ae933dad-kube-api-access-gzlmp" (OuterVolumeSpecName: "kube-api-access-gzlmp") pod "889475ab-a942-4090-8f52-5e46ae933dad" (UID: "889475ab-a942-4090-8f52-5e46ae933dad"). InnerVolumeSpecName "kube-api-access-gzlmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.384028 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzlmp\" (UniqueName: \"kubernetes.io/projected/889475ab-a942-4090-8f52-5e46ae933dad-kube-api-access-gzlmp\") on node \"crc\" DevicePath \"\""
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.384150 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.410165 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "889475ab-a942-4090-8f52-5e46ae933dad" (UID: "889475ab-a942-4090-8f52-5e46ae933dad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.486219 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/889475ab-a942-4090-8f52-5e46ae933dad-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.554472 4546 generic.go:334] "Generic (PLEG): container finished" podID="889475ab-a942-4090-8f52-5e46ae933dad" containerID="1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5" exitCode=0
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.554523 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlqrm" event={"ID":"889475ab-a942-4090-8f52-5e46ae933dad","Type":"ContainerDied","Data":"1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5"}
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.554545 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zlqrm"
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.554561 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlqrm" event={"ID":"889475ab-a942-4090-8f52-5e46ae933dad","Type":"ContainerDied","Data":"3715862471cc1e4287f78ebb81dbb25a96f3609d74f77ef809f26d503333f02f"}
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.554580 4546 scope.go:117] "RemoveContainer" containerID="1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5"
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.583292 4546 scope.go:117] "RemoveContainer" containerID="d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840"
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.589491 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zlqrm"]
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.610166 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zlqrm"]
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.623331 4546 scope.go:117] "RemoveContainer" containerID="e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307"
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.643563 4546 scope.go:117] "RemoveContainer" containerID="1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5"
Feb 01 09:06:36 crc kubenswrapper[4546]: E0201 09:06:36.644295 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5\": container with ID starting with 1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5 not found: ID does not exist" containerID="1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5"
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.644342 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5"} err="failed to get container status \"1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5\": rpc error: code = NotFound desc = could not find container \"1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5\": container with ID starting with 1915bf9af411d1c59c81911252de8e38e0fccf747a412af01bae0dda8c2d90b5 not found: ID does not exist"
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.644376 4546 scope.go:117] "RemoveContainer" containerID="d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840"
Feb 01 09:06:36 crc kubenswrapper[4546]: E0201 09:06:36.644737 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840\": container with ID starting with d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840 not found: ID does not exist" containerID="d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840"
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.644760 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840"} err="failed to get container status \"d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840\": rpc error: code = NotFound desc = could not find container \"d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840\": container with ID starting with d0cd893acfdc71c55576401f4f36deddfd49b65c73ebcf295feb447b50f2e840 not found: ID does not exist"
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.644777 4546 scope.go:117] "RemoveContainer" containerID="e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307"
Feb 01 09:06:36 crc kubenswrapper[4546]: E0201 09:06:36.645151 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307\": container with ID starting with e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307 not found: ID does not exist" containerID="e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307"
Feb 01 09:06:36 crc kubenswrapper[4546]: I0201 09:06:36.645251 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307"} err="failed to get container status \"e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307\": rpc error: code = NotFound desc = could not find container \"e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307\": container with ID starting with e5f089d93ac7b3ba3cc0ca3d538169ac552d1b1dbc664470fab0b33dfc9c5307 not found: ID does not exist"
Feb 01 09:06:37 crc kubenswrapper[4546]: I0201 09:06:37.665592 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="889475ab-a942-4090-8f52-5e46ae933dad" path="/var/lib/kubelet/pods/889475ab-a942-4090-8f52-5e46ae933dad/volumes"
Feb 01 09:06:47 crc kubenswrapper[4546]: I0201 09:06:47.655638 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"
Feb 01 09:06:47 crc kubenswrapper[4546]: E0201 09:06:47.657678 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 09:07:00 crc kubenswrapper[4546]: I0201 09:07:00.655224 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"
Feb 01 09:07:00 crc kubenswrapper[4546]: E0201 09:07:00.656343 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 09:07:13 crc kubenswrapper[4546]: I0201 09:07:13.655668 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"
Feb 01 09:07:13 crc kubenswrapper[4546]: E0201 09:07:13.656751 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 09:07:25 crc kubenswrapper[4546]: I0201 09:07:25.655289 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"
Feb 01 09:07:25 crc kubenswrapper[4546]: E0201 09:07:25.656259 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 09:07:36 crc kubenswrapper[4546]: I0201 09:07:36.655643 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"
Feb 01 09:07:36 crc kubenswrapper[4546]: E0201 09:07:36.656870 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 09:07:47 crc kubenswrapper[4546]: I0201 09:07:47.656168 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"
Feb 01 09:07:47 crc kubenswrapper[4546]: E0201 09:07:47.660332 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 09:08:00 crc kubenswrapper[4546]: I0201 09:08:00.655965 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"
Feb 01 09:08:00 crc kubenswrapper[4546]: E0201 09:08:00.657440 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 09:08:15 crc kubenswrapper[4546]: I0201 09:08:15.670598 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"
Feb 01 09:08:15 crc kubenswrapper[4546]: E0201 09:08:15.690111 4546 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dwtsx_openshift-machine-config-operator(a4316448-1833-40f9-bdd7-e13d7dd4da6b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b"
Feb 01 09:08:28 crc kubenswrapper[4546]: I0201 09:08:28.655237 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86"
Feb 01 09:08:29 crc kubenswrapper[4546]: I0201 09:08:29.596675 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"c49cd4930c9b52a4ad3be000e6dedc9cf775e545e655285d2773dff7dceb2925"}
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.168826 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pzhx6"]
Feb 01 09:08:34 crc kubenswrapper[4546]: E0201 09:08:34.172056 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="extract-content"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.172096 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="extract-content"
Feb 01 09:08:34 crc kubenswrapper[4546]: E0201 09:08:34.172509 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="extract-utilities"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.172533 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="extract-utilities"
Feb 01 09:08:34 crc kubenswrapper[4546]: E0201 09:08:34.172597 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="registry-server"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.172650 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="registry-server"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.173733 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="889475ab-a942-4090-8f52-5e46ae933dad" containerName="registry-server"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.178989 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.192537 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwmz8\" (UniqueName: \"kubernetes.io/projected/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-kube-api-access-xwmz8\") pod \"redhat-marketplace-pzhx6\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") " pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.192730 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-catalog-content\") pod \"redhat-marketplace-pzhx6\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") " pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.192925 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-utilities\") pod \"redhat-marketplace-pzhx6\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") " pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.198827 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzhx6"]
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.294322 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-catalog-content\") pod \"redhat-marketplace-pzhx6\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") " pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.294408 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-utilities\") pod \"redhat-marketplace-pzhx6\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") " pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.294565 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwmz8\" (UniqueName: \"kubernetes.io/projected/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-kube-api-access-xwmz8\") pod \"redhat-marketplace-pzhx6\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") " pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.296216 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-utilities\") pod \"redhat-marketplace-pzhx6\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") " pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.296490 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-catalog-content\") pod \"redhat-marketplace-pzhx6\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") " pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.318981 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwmz8\" (UniqueName: \"kubernetes.io/projected/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-kube-api-access-xwmz8\") pod \"redhat-marketplace-pzhx6\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") " pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:34 crc kubenswrapper[4546]: I0201 09:08:34.500341 4546 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:35 crc kubenswrapper[4546]: I0201 09:08:35.242332 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzhx6"]
Feb 01 09:08:35 crc kubenswrapper[4546]: I0201 09:08:35.656631 4546 generic.go:334] "Generic (PLEG): container finished" podID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerID="cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc" exitCode=0
Feb 01 09:08:35 crc kubenswrapper[4546]: I0201 09:08:35.660224 4546 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 01 09:08:35 crc kubenswrapper[4546]: I0201 09:08:35.666938 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzhx6" event={"ID":"ab641acb-18f4-4317-8a24-bb5d2de8a2ff","Type":"ContainerDied","Data":"cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc"}
Feb 01 09:08:35 crc kubenswrapper[4546]: I0201 09:08:35.666982 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzhx6" event={"ID":"ab641acb-18f4-4317-8a24-bb5d2de8a2ff","Type":"ContainerStarted","Data":"c033aa61347378e7f3c09ac0f78f0acf85c24179854e56ae621c2d92134e3663"}
Feb 01 09:08:36 crc kubenswrapper[4546]: I0201 09:08:36.667536 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzhx6" event={"ID":"ab641acb-18f4-4317-8a24-bb5d2de8a2ff","Type":"ContainerStarted","Data":"a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9"}
Feb 01 09:08:37 crc kubenswrapper[4546]: I0201 09:08:37.676694 4546 generic.go:334] "Generic (PLEG): container finished" podID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerID="a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9" exitCode=0
Feb 01 09:08:37 crc kubenswrapper[4546]: I0201 09:08:37.676788 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzhx6" event={"ID":"ab641acb-18f4-4317-8a24-bb5d2de8a2ff","Type":"ContainerDied","Data":"a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9"}
Feb 01 09:08:38 crc kubenswrapper[4546]: I0201 09:08:38.688923 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzhx6" event={"ID":"ab641acb-18f4-4317-8a24-bb5d2de8a2ff","Type":"ContainerStarted","Data":"192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80"}
Feb 01 09:08:38 crc kubenswrapper[4546]: I0201 09:08:38.712648 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pzhx6" podStartSLOduration=2.109856529 podStartE2EDuration="4.71171174s" podCreationTimestamp="2026-02-01 09:08:34 +0000 UTC" firstStartedPulling="2026-02-01 09:08:35.657626788 +0000 UTC m=+8746.308562804" lastFinishedPulling="2026-02-01 09:08:38.259481999 +0000 UTC m=+8748.910418015" observedRunningTime="2026-02-01 09:08:38.708813306 +0000 UTC m=+8749.359749322" watchObservedRunningTime="2026-02-01 09:08:38.71171174 +0000 UTC m=+8749.362647756"
Feb 01 09:08:44 crc kubenswrapper[4546]: I0201 09:08:44.500903 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:44 crc kubenswrapper[4546]: I0201 09:08:44.501553 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:44 crc kubenswrapper[4546]: I0201 09:08:44.544643 4546 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:44 crc kubenswrapper[4546]: I0201 09:08:44.769913 4546 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:44 crc kubenswrapper[4546]: I0201 09:08:44.826316 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzhx6"]
Feb 01 09:08:46 crc kubenswrapper[4546]: I0201 09:08:46.751762 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pzhx6" podUID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerName="registry-server" containerID="cri-o://192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80" gracePeriod=2
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.202274 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.382281 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-catalog-content\") pod \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") "
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.382519 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwmz8\" (UniqueName: \"kubernetes.io/projected/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-kube-api-access-xwmz8\") pod \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") "
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.382806 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-utilities\") pod \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\" (UID: \"ab641acb-18f4-4317-8a24-bb5d2de8a2ff\") "
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.384201 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-utilities" (OuterVolumeSpecName: "utilities") pod "ab641acb-18f4-4317-8a24-bb5d2de8a2ff" (UID: "ab641acb-18f4-4317-8a24-bb5d2de8a2ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.398255 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-kube-api-access-xwmz8" (OuterVolumeSpecName: "kube-api-access-xwmz8") pod "ab641acb-18f4-4317-8a24-bb5d2de8a2ff" (UID: "ab641acb-18f4-4317-8a24-bb5d2de8a2ff"). InnerVolumeSpecName "kube-api-access-xwmz8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.401615 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab641acb-18f4-4317-8a24-bb5d2de8a2ff" (UID: "ab641acb-18f4-4317-8a24-bb5d2de8a2ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.485773 4546 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.486069 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwmz8\" (UniqueName: \"kubernetes.io/projected/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-kube-api-access-xwmz8\") on node \"crc\" DevicePath \"\""
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.486080 4546 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab641acb-18f4-4317-8a24-bb5d2de8a2ff-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.758958 4546 generic.go:334] "Generic (PLEG): container finished" podID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerID="192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80" exitCode=0
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.758996 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzhx6" event={"ID":"ab641acb-18f4-4317-8a24-bb5d2de8a2ff","Type":"ContainerDied","Data":"192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80"}
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.759010 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzhx6"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.759022 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzhx6" event={"ID":"ab641acb-18f4-4317-8a24-bb5d2de8a2ff","Type":"ContainerDied","Data":"c033aa61347378e7f3c09ac0f78f0acf85c24179854e56ae621c2d92134e3663"}
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.759040 4546 scope.go:117] "RemoveContainer" containerID="192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.776167 4546 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzhx6"]
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.784320 4546 scope.go:117] "RemoveContainer" containerID="a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.784901 4546 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzhx6"]
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.803462 4546 scope.go:117] "RemoveContainer" containerID="cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.841066 4546 scope.go:117] "RemoveContainer" containerID="192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80"
Feb 01 09:08:47 crc kubenswrapper[4546]: E0201 09:08:47.842622 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80\": container with ID starting with 192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80 not found: ID does not exist" containerID="192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.843278 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80"} err="failed to get container status \"192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80\": rpc error: code = NotFound desc = could not find container \"192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80\": container with ID starting with 192a906e164395cd37a5a760648e4fadd6e07f811098a862ded43a23fada7c80 not found: ID does not exist"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.843307 4546 scope.go:117] "RemoveContainer" containerID="a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9"
Feb 01 09:08:47 crc kubenswrapper[4546]: E0201 09:08:47.843894 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9\": container with ID starting with a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9 not found: ID does not exist" containerID="a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.843917 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9"} err="failed to get container status \"a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9\": rpc error: code = NotFound desc = could not find container \"a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9\": container with ID starting with a496838abc074342c76ff3bd0f10b63b01835ea2d4fe00561aca0fee8b394ac9 not found: ID does not exist"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.843933 4546 scope.go:117] "RemoveContainer" containerID="cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc"
Feb 01 09:08:47 crc kubenswrapper[4546]: E0201 09:08:47.844211 4546 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc\": container with ID starting with cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc not found: ID does not exist" containerID="cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc"
Feb 01 09:08:47 crc kubenswrapper[4546]: I0201 09:08:47.844258 4546 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc"} err="failed to get container status \"cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc\": rpc error: code = NotFound desc = could not find container \"cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc\": container with ID starting with cec81694899dc00c9aa6f8a343f838b4053c90f3e28aba2c653be2c28cafb0cc not found: ID does not exist"
Feb 01 09:08:49 crc kubenswrapper[4546]: I0201 09:08:49.667181 4546 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" path="/var/lib/kubelet/pods/ab641acb-18f4-4317-8a24-bb5d2de8a2ff/volumes"
Feb 01 09:10:55 crc kubenswrapper[4546]: I0201 09:10:55.420551 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 09:10:55 crc kubenswrapper[4546]: I0201 09:10:55.421820 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 09:11:22 crc kubenswrapper[4546]: I0201 09:11:22.136726 4546 generic.go:334] "Generic (PLEG): container finished" podID="297ca525-97ec-433c-82ec-01cf98fb4c52" containerID="7a91d65171a04094906210eccb8074bba06bad8108d8412ec982eacb6b48bc97" exitCode=0
Feb 01 09:11:22 crc kubenswrapper[4546]: I0201 09:11:22.136809 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"297ca525-97ec-433c-82ec-01cf98fb4c52","Type":"ContainerDied","Data":"7a91d65171a04094906210eccb8074bba06bad8108d8412ec982eacb6b48bc97"}
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.698546 4546 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing"
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.781211 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ca-certs\") pod \"297ca525-97ec-433c-82ec-01cf98fb4c52\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") "
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.781316 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ssh-key\") pod \"297ca525-97ec-433c-82ec-01cf98fb4c52\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") "
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.781499 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config-secret\") pod \"297ca525-97ec-433c-82ec-01cf98fb4c52\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") "
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.781633 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-config-data\") pod \"297ca525-97ec-433c-82ec-01cf98fb4c52\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") "
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.781800 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-temporary\") pod \"297ca525-97ec-433c-82ec-01cf98fb4c52\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") "
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.781919 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"297ca525-97ec-433c-82ec-01cf98fb4c52\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") "
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.781997 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwlzl\" (UniqueName: \"kubernetes.io/projected/297ca525-97ec-433c-82ec-01cf98fb4c52-kube-api-access-mwlzl\") pod \"297ca525-97ec-433c-82ec-01cf98fb4c52\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") "
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.782104 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config\") pod \"297ca525-97ec-433c-82ec-01cf98fb4c52\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") "
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.782252 4546 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-workdir\") pod \"297ca525-97ec-433c-82ec-01cf98fb4c52\" (UID: \"297ca525-97ec-433c-82ec-01cf98fb4c52\") "
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.783440 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "297ca525-97ec-433c-82ec-01cf98fb4c52" (UID: "297ca525-97ec-433c-82ec-01cf98fb4c52"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.784120 4546 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.786774 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-config-data" (OuterVolumeSpecName: "config-data") pod "297ca525-97ec-433c-82ec-01cf98fb4c52" (UID: "297ca525-97ec-433c-82ec-01cf98fb4c52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.794175 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "297ca525-97ec-433c-82ec-01cf98fb4c52" (UID: "297ca525-97ec-433c-82ec-01cf98fb4c52"). InnerVolumeSpecName "local-storage10-crc".
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.794338 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "297ca525-97ec-433c-82ec-01cf98fb4c52" (UID: "297ca525-97ec-433c-82ec-01cf98fb4c52"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.800798 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/297ca525-97ec-433c-82ec-01cf98fb4c52-kube-api-access-mwlzl" (OuterVolumeSpecName: "kube-api-access-mwlzl") pod "297ca525-97ec-433c-82ec-01cf98fb4c52" (UID: "297ca525-97ec-433c-82ec-01cf98fb4c52"). InnerVolumeSpecName "kube-api-access-mwlzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.820010 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "297ca525-97ec-433c-82ec-01cf98fb4c52" (UID: "297ca525-97ec-433c-82ec-01cf98fb4c52"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.820851 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "297ca525-97ec-433c-82ec-01cf98fb4c52" (UID: "297ca525-97ec-433c-82ec-01cf98fb4c52"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.823182 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "297ca525-97ec-433c-82ec-01cf98fb4c52" (UID: "297ca525-97ec-433c-82ec-01cf98fb4c52"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.843114 4546 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "297ca525-97ec-433c-82ec-01cf98fb4c52" (UID: "297ca525-97ec-433c-82ec-01cf98fb4c52"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.888083 4546 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/297ca525-97ec-433c-82ec-01cf98fb4c52-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.888483 4546 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.888565 4546 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.888715 4546 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config-secret\") on node \"crc\" 
DevicePath \"\"" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.888736 4546 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.889265 4546 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.889373 4546 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwlzl\" (UniqueName: \"kubernetes.io/projected/297ca525-97ec-433c-82ec-01cf98fb4c52-kube-api-access-mwlzl\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.889453 4546 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/297ca525-97ec-433c-82ec-01cf98fb4c52-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.903934 4546 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 01 09:11:23 crc kubenswrapper[4546]: I0201 09:11:23.993703 4546 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:24 crc kubenswrapper[4546]: I0201 09:11:24.159654 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" event={"ID":"297ca525-97ec-433c-82ec-01cf98fb4c52","Type":"ContainerDied","Data":"bab04943ac2ed8931f09153854358dfd3bed550313da79291606936b00890b43"} Feb 01 09:11:24 crc kubenswrapper[4546]: I0201 09:11:24.159696 4546 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-thread-testing" Feb 01 09:11:24 crc kubenswrapper[4546]: I0201 09:11:24.159694 4546 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bab04943ac2ed8931f09153854358dfd3bed550313da79291606936b00890b43" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.421069 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.421527 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.608695 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 01 09:11:25 crc kubenswrapper[4546]: E0201 09:11:25.609243 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerName="extract-utilities" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.609264 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerName="extract-utilities" Feb 01 09:11:25 crc kubenswrapper[4546]: E0201 09:11:25.609289 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerName="extract-content" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.609296 4546 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerName="extract-content" Feb 01 09:11:25 crc kubenswrapper[4546]: E0201 09:11:25.609313 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerName="registry-server" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.609320 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerName="registry-server" Feb 01 09:11:25 crc kubenswrapper[4546]: E0201 09:11:25.609332 4546 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297ca525-97ec-433c-82ec-01cf98fb4c52" containerName="tempest-tests-tempest-tests-runner" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.609339 4546 state_mem.go:107] "Deleted CPUSet assignment" podUID="297ca525-97ec-433c-82ec-01cf98fb4c52" containerName="tempest-tests-tempest-tests-runner" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.609600 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="297ca525-97ec-433c-82ec-01cf98fb4c52" containerName="tempest-tests-tempest-tests-runner" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.609637 4546 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab641acb-18f4-4317-8a24-bb5d2de8a2ff" containerName="registry-server" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.610369 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.616988 4546 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-7sw8w" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.657863 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.748933 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9ee3afa-d2fc-415e-ba09-03e91352f754\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.748989 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsjbn\" (UniqueName: \"kubernetes.io/projected/a9ee3afa-d2fc-415e-ba09-03e91352f754-kube-api-access-jsjbn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9ee3afa-d2fc-415e-ba09-03e91352f754\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.853264 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9ee3afa-d2fc-415e-ba09-03e91352f754\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.853319 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsjbn\" (UniqueName: 
\"kubernetes.io/projected/a9ee3afa-d2fc-415e-ba09-03e91352f754-kube-api-access-jsjbn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9ee3afa-d2fc-415e-ba09-03e91352f754\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.854469 4546 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9ee3afa-d2fc-415e-ba09-03e91352f754\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.873478 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsjbn\" (UniqueName: \"kubernetes.io/projected/a9ee3afa-d2fc-415e-ba09-03e91352f754-kube-api-access-jsjbn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9ee3afa-d2fc-415e-ba09-03e91352f754\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.879005 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9ee3afa-d2fc-415e-ba09-03e91352f754\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 09:11:25 crc kubenswrapper[4546]: I0201 09:11:25.932790 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 09:11:26 crc kubenswrapper[4546]: I0201 09:11:26.377146 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 01 09:11:26 crc kubenswrapper[4546]: W0201 09:11:26.391584 4546 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9ee3afa_d2fc_415e_ba09_03e91352f754.slice/crio-f1bfb797abb8385352824147aac3b932a917aae6dbbc544cec84d8526cbe9749 WatchSource:0}: Error finding container f1bfb797abb8385352824147aac3b932a917aae6dbbc544cec84d8526cbe9749: Status 404 returned error can't find the container with id f1bfb797abb8385352824147aac3b932a917aae6dbbc544cec84d8526cbe9749 Feb 01 09:11:27 crc kubenswrapper[4546]: I0201 09:11:27.190713 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a9ee3afa-d2fc-415e-ba09-03e91352f754","Type":"ContainerStarted","Data":"f1bfb797abb8385352824147aac3b932a917aae6dbbc544cec84d8526cbe9749"} Feb 01 09:11:28 crc kubenswrapper[4546]: I0201 09:11:28.202763 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a9ee3afa-d2fc-415e-ba09-03e91352f754","Type":"ContainerStarted","Data":"100b4fb8bfd3779f2bbfdd1c8ec4cbad44c6abe1e6b164b7e3cd9faa55f693c8"} Feb 01 09:11:28 crc kubenswrapper[4546]: I0201 09:11:28.223615 4546 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.175574582 podStartE2EDuration="3.223594175s" podCreationTimestamp="2026-02-01 09:11:25 +0000 UTC" firstStartedPulling="2026-02-01 09:11:26.396951345 +0000 UTC m=+8917.047887362" lastFinishedPulling="2026-02-01 09:11:27.444970939 +0000 UTC m=+8918.095906955" 
observedRunningTime="2026-02-01 09:11:28.217123184 +0000 UTC m=+8918.868059200" watchObservedRunningTime="2026-02-01 09:11:28.223594175 +0000 UTC m=+8918.874530190" Feb 01 09:11:55 crc kubenswrapper[4546]: I0201 09:11:55.421126 4546 patch_prober.go:28] interesting pod/machine-config-daemon-dwtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:11:55 crc kubenswrapper[4546]: I0201 09:11:55.421727 4546 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:11:55 crc kubenswrapper[4546]: I0201 09:11:55.421767 4546 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" Feb 01 09:11:55 crc kubenswrapper[4546]: I0201 09:11:55.422494 4546 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c49cd4930c9b52a4ad3be000e6dedc9cf775e545e655285d2773dff7dceb2925"} pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:11:55 crc kubenswrapper[4546]: I0201 09:11:55.422548 4546 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" podUID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerName="machine-config-daemon" containerID="cri-o://c49cd4930c9b52a4ad3be000e6dedc9cf775e545e655285d2773dff7dceb2925" gracePeriod=600 Feb 01 09:11:56 crc kubenswrapper[4546]: I0201 
09:11:56.452911 4546 generic.go:334] "Generic (PLEG): container finished" podID="a4316448-1833-40f9-bdd7-e13d7dd4da6b" containerID="c49cd4930c9b52a4ad3be000e6dedc9cf775e545e655285d2773dff7dceb2925" exitCode=0 Feb 01 09:11:56 crc kubenswrapper[4546]: I0201 09:11:56.452992 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerDied","Data":"c49cd4930c9b52a4ad3be000e6dedc9cf775e545e655285d2773dff7dceb2925"} Feb 01 09:11:56 crc kubenswrapper[4546]: I0201 09:11:56.453475 4546 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dwtsx" event={"ID":"a4316448-1833-40f9-bdd7-e13d7dd4da6b","Type":"ContainerStarted","Data":"f366bd5d006ff35d5ee36333a651149992a5c414ae517f384e907f7856f988a8"} Feb 01 09:11:56 crc kubenswrapper[4546]: I0201 09:11:56.453496 4546 scope.go:117] "RemoveContainer" containerID="92d115905a92c9583abb27664fe6ca4ace88800a065e25d89703d5c9bc1d8e86" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.670939 4546 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5mjgm"] Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.673637 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.682637 4546 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5mjgm"] Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.735452 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkqpb\" (UniqueName: \"kubernetes.io/projected/c4e7592c-3b2c-473d-b5db-8fed06047a08-kube-api-access-dkqpb\") pod \"community-operators-5mjgm\" (UID: \"c4e7592c-3b2c-473d-b5db-8fed06047a08\") " pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.736024 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4e7592c-3b2c-473d-b5db-8fed06047a08-utilities\") pod \"community-operators-5mjgm\" (UID: \"c4e7592c-3b2c-473d-b5db-8fed06047a08\") " pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.736230 4546 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4e7592c-3b2c-473d-b5db-8fed06047a08-catalog-content\") pod \"community-operators-5mjgm\" (UID: \"c4e7592c-3b2c-473d-b5db-8fed06047a08\") " pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.838361 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4e7592c-3b2c-473d-b5db-8fed06047a08-catalog-content\") pod \"community-operators-5mjgm\" (UID: \"c4e7592c-3b2c-473d-b5db-8fed06047a08\") " pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.838507 4546 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dkqpb\" (UniqueName: \"kubernetes.io/projected/c4e7592c-3b2c-473d-b5db-8fed06047a08-kube-api-access-dkqpb\") pod \"community-operators-5mjgm\" (UID: \"c4e7592c-3b2c-473d-b5db-8fed06047a08\") " pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.838751 4546 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4e7592c-3b2c-473d-b5db-8fed06047a08-utilities\") pod \"community-operators-5mjgm\" (UID: \"c4e7592c-3b2c-473d-b5db-8fed06047a08\") " pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.839612 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4e7592c-3b2c-473d-b5db-8fed06047a08-catalog-content\") pod \"community-operators-5mjgm\" (UID: \"c4e7592c-3b2c-473d-b5db-8fed06047a08\") " pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.840220 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4e7592c-3b2c-473d-b5db-8fed06047a08-utilities\") pod \"community-operators-5mjgm\" (UID: \"c4e7592c-3b2c-473d-b5db-8fed06047a08\") " pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:10 crc kubenswrapper[4546]: I0201 09:13:10.866115 4546 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkqpb\" (UniqueName: \"kubernetes.io/projected/c4e7592c-3b2c-473d-b5db-8fed06047a08-kube-api-access-dkqpb\") pod \"community-operators-5mjgm\" (UID: \"c4e7592c-3b2c-473d-b5db-8fed06047a08\") " pod="openshift-marketplace/community-operators-5mjgm" Feb 01 09:13:11 crc kubenswrapper[4546]: I0201 09:13:11.006369 4546 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5mjgm"